116 changes: 91 additions & 25 deletions docs/guide/getting-started/architecture.md
@@ -2,6 +2,8 @@

## 2.4.1 Architecture Diagram

### Core Architecture

```mermaid
graph TB
subgraph AmritaCore
@@ -12,7 +14,7 @@ graph TB
E[Memory Model]
F[Agent Core]
end

UserInput[User Input] --> A
A --> B
A --> C
@@ -22,16 +24,51 @@ graph TB
D --> F
E --> F
F --> ResponseStream[Response Stream]

ResponseStream --> UserOutput[User Output]
F --> E

LLM[LLM Provider] <---> F

style AmritaCore fill:#e1f5fe,stroke:#0277bd,stroke-width:2px
style F fill:#fff3e0,stroke:#f57c00,stroke-width:2px
```

### Session and Global Data Container Architecture

#### Global Container and Session Conversation Context

```mermaid
graph TB
subgraph Global_Container["Global Container"]
G_Tools[Global Tools]
G_Presets[Global Presets]
G_Config[Global Configuration]
end

subgraph Session_Container["SessionsManager"]
S1[Session 1<br/>Conversation Context]
S2[Session 2<br/>Conversation Context]
SN[Session N<br/>Conversation Context]
end

subgraph "Single Session Structure"
Mem[Memory Model<br/>Conversation History]
Tools[Tools Manager<br/>Current Session Tools]
Conf[Configuration<br/>Current Session Config]
end

G_Tools -.-> Tools
G_Presets -.-> S1
G_Config -.-> Conf

style Global_Container fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px
style Session_Container fill:#e3f2fd,stroke:#1565c0,stroke-width:2px
style S1 fill:#bbdefb,stroke:#1565c0,stroke-width:1px
style S2 fill:#bbdefb,stroke:#1565c0,stroke-width:1px
style SN fill:#bbdefb,stroke:#1565c0,stroke-width:1px
```

## 2.4.2 Core Component Relationships

- **ChatObject**: The main interaction point that manages a single conversation
@@ -40,31 +77,60 @@ graph TB
- **Tools Manager**: Extends the agent's capabilities with external functions
- **Memory Model**: Maintains conversation context and history
- **Agent Core**: The central processing unit coordinating all components
- **SessionsManager**: Manages multiple isolated sessions, each as an independent conversation context
- **Session (Conversation Context)**: Stores all state for a specific user or conversation, including its memory model, tools, and configuration

## 2.4.3 Data Flow Explanation
## 2.4.3 Agent Loop and Session Isolation Mechanism

```mermaid
sequenceDiagram
User->>CO: Send input message
CO->>Config: Check configuration
CO->>Events: Trigger input events
CO->>Agent: Initialize processing
Agent->>Memory: Load conversation context
Agent->>Tools: Check for required tools
Agent->>LLM: Format and send request
participant User1 as User1
participant User2 as User2
participant SM as SessionsManager
participant S1 as Session 1<br/>(Conversation Context 1)
participant S2 as Session 2<br/>(Conversation Context 2)
participant Agent as Agent Core
participant LLM as LLM Provider

Note over User1,LLM: Agent Loop Begins
User1->>SM: Request to create Session 1
User2->>SM: Request to create Session 2
SM-->>User1: Return Session ID 1
SM-->>User2: Return Session ID 2

Note over User1,User2: Each user interacts in their respective conversation context

User1->>S1: Send message to Session 1
S1->>S1: Initialize ChatObject
S1->>Agent: Start Agent Loop to process request

User2->>S2: Send message to Session 2
S2->>S2: Initialize ChatObject
S2->>Agent: Start Agent Loop to process request

par Parallel Processing of Two Conversation Contexts
Agent->>S1: Process in Session 1 Context
Agent->>S2: Process in Session 2 Context

Agent->>S1: Update Session 1 Memory Model
Agent->>S2: Update Session 2 Memory Model
end

Agent->>LLM: Send request (from Session 1)
LLM-->>Agent: Return response
Agent->>Memory: Update conversation state
Agent->>Events: Trigger output events
CO-->>User: Stream response
Agent-->>S1: Update Session 1 State
S1-->>User1: Stream response

Agent->>LLM: Send request (from Session 2)
LLM-->>Agent: Return response
Agent-->>S2: Update Session 2 State
S2-->>User2: Stream response

Note over User1,LLM: Each conversation context maintains independent history and state
```

1. User input enters through a ChatObject
2. Configuration determines processing behavior
3. The input triggers various events in the Events System
4. Agent Core loads the conversation context from Memory Model
5. Agent Core checks if any tools need to be called based on the input
6. The processed input is sent to the LLM Provider
7. The response from LLM is handled by Agent Core
8. Memory Model is updated with the new conversation state
9. Output events may intercept and modify the final response
10. The response is streamed back to the user via ChatObject
1. **Session as Conversation Context**: Each Session represents an independent conversation context, storing all state for a specific user or conversation
2. **Global Data Container**: SessionsManager manages all active conversation contexts, providing global resource sharing
3. **Agent Loop**: Inside each conversation context, the Agent Core executes the complete processing loop
4. **Context Isolation**: Data between different conversation contexts is completely isolated, ensuring conversation histories don't mix
5. **Global Resource Sharing**: Each conversation context can access resources from the Global container, but maintains its own independent state
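
The loop-per-context behavior described above can be sketched with `asyncio`. All names here (`sessions`, `agent_loop`, the `echo` stand-in for the LLM call) are illustrative, not amrita_core's API:

```python
import asyncio

# Hypothetical in-memory conversation contexts, keyed by session ID.
sessions: dict[str, list[str]] = {"session-1": [], "session-2": []}


async def agent_loop(session_id: str, user_input: str) -> str:
    """One pass of the agent loop inside a single conversation context."""
    history = sessions[session_id]  # load this context's memory
    history.append(f"user: {user_input}")  # record the input
    reply = f"echo({session_id}): {user_input}"  # stand-in for the LLM call
    history.append(f"assistant: {reply}")  # update conversation state
    return reply


async def main() -> None:
    # Two users interact in parallel; each context stays isolated.
    await asyncio.gather(
        agent_loop("session-1", "hello"),
        agent_loop("session-2", "hi there"),
    )


asyncio.run(main())
```

After the run, each session's history contains only its own exchange — the isolation guarantee in point 4 above.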
110 changes: 88 additions & 22 deletions docs/zh/guide/getting-started/architecture.md
@@ -2,6 +2,8 @@

## 2.4.1 Architecture Diagram

### Core Architecture

```mermaid
graph TB
subgraph AmritaCore
@@ -32,6 +34,41 @@ graph TB
style F fill:#fff3e0,stroke:#f57c00,stroke-width:2px
```

### Session and Global Data Container Architecture

#### Global Container and Session Conversation Context

```mermaid
graph TB
subgraph Global_Container["Global Container"]
G_Tools[Global Tools]
G_Presets[Global Presets]
G_Config[Global Configuration]
end

subgraph Session_Container["SessionsManager"]
S1[Session 1<br/>Conversation Context]
S2[Session 2<br/>Conversation Context]
SN[Session N<br/>Conversation Context]
end

subgraph "Single Session Structure"
Mem[Memory Model<br/>Conversation History]
Tools[Tools Manager<br/>Current Session Tools]
Conf[Configuration<br/>Current Session Config]
end

G_Tools -.-> Tools
G_Presets -.-> S1
G_Config -.-> Conf

style Global_Container fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px
style Session_Container fill:#e3f2fd,stroke:#1565c0,stroke-width:2px
style S1 fill:#bbdefb,stroke:#1565c0,stroke-width:1px
style S2 fill:#bbdefb,stroke:#1565c0,stroke-width:1px
style SN fill:#bbdefb,stroke:#1565c0,stroke-width:1px
```

## 2.4.2 Core Component Relationships

- **ChatObject**: The main interaction point that manages a single conversation
@@ -40,31 +77,60 @@ graph TB
- **Tools Manager**: Extends the agent's capabilities with external functions
- **Memory Model**: Maintains conversation context and history
- **Agent Core**: The central processing unit coordinating all components
- **SessionsManager**: Manages multiple isolated sessions, each as an independent conversation context
- **Session (Conversation Context)**: Stores all state for a specific user or conversation, including its memory model, tools, and configuration

## 2.4.3 Data Flow Explanation
## 2.4.3 Agent Loop and Session Isolation Mechanism

```mermaid
sequenceDiagram
User->>ChatObject: Send input message
ChatObject->>Config: Check configuration
ChatObject->>Events: Trigger input events
ChatObject->>Agent: Initialize processing
Agent->>Memory: Load conversation context
Agent->>Tools: Check for required tools
Agent->>LLM: Format and send request
LLM-->>Agent: Return response
Agent->>Memory: Update conversation state
Agent->>Events: Trigger output events
ChatObject-->>User: Stream response
participant User1 as User1
participant User2 as User2
participant SM as SessionsManager
participant S1 as Session 1<br/>(Conversation Context 1)
participant S2 as Session 2<br/>(Conversation Context 2)
participant Agent as Agent Core
participant LLM as LLM Provider

Note over User1,LLM: Agent Loop Begins
User1->>SM: Request to create Session 1
User2->>SM: Request to create Session 2
SM-->>User1: Return Session ID 1
SM-->>User2: Return Session ID 2

Note over User1,User2: Each user interacts in their respective conversation context

User1->>S1: Send message to Session 1
S1->>S1: Initialize ChatObject
S1->>Agent: Start Agent Loop to process request

User2->>S2: Send message to Session 2
S2->>S2: Initialize ChatObject
S2->>Agent: Start Agent Loop to process request

par Parallel Processing of Two Conversation Contexts
Agent->>S1: Process in Session 1 Context
Agent->>S2: Process in Session 2 Context

Agent->>S1: Update Session 1 Memory Model
Agent->>S2: Update Session 2 Memory Model
end

Agent->>LLM: Send request (from Session 1)
LLM-->>Agent: Return response
Agent-->>S1: Update Session 1 State
S1-->>User1: Stream response

Agent->>LLM: Send request (from Session 2)
LLM-->>Agent: Return response
Agent-->>S2: Update Session 2 State
S2-->>User2: Stream response

Note over User1,LLM: Each conversation context maintains independent history and state
```

1. User input enters through a ChatObject
2. Configuration determines processing behavior
3. The input triggers various events in the Events System
4. Agent Core loads the conversation context from Memory Model
5. Agent Core checks if any tools need to be called based on the input
6. The processed input is sent to the LLM Provider
7. The response from LLM is handled by Agent Core
8. Memory Model is updated with the new conversation state
9. Output events may intercept and modify the final response
10. The response is streamed back to the user via ChatObject
1. **Session as Conversation Context**: Each Session represents an independent conversation context, storing all state for a specific user or conversation
2. **Global Data Container**: SessionsManager manages all active conversation contexts, providing global resource sharing
3. **Agent Loop**: Inside each conversation context, the Agent Core executes the complete processing loop
4. **Context Isolation**: Data between different conversation contexts is completely isolated, ensuring conversation histories don't mix
5. **Global Resource Sharing**: Each conversation context can access shared resources from the Global container, but maintains its own independent state
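
Point 5 above — shared global resources alongside independent per-session state — can be sketched as a lookup chain where session-local entries shadow global ones. The names and the `ChainMap` approach are illustrative only, not amrita_core's actual mechanism:

```python
from collections import ChainMap

# Hypothetical registries: global tools shared by every context,
# plus one session-local override.
global_tools = {"search": "global-search", "time": "global-time"}
session_tools = {"search": "session-search"}  # local state wins

# Each conversation context resolves its own tools first, then the globals.
resolved = ChainMap(session_tools, global_tools)

print(resolved["search"])  # prints session-search
print(resolved["time"])    # prints global-time
```

Writes through such a chain land in the session-local layer by default, so one context's overrides never alter the global container or any other context.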
5 changes: 4 additions & 1 deletion pyproject.toml
@@ -1,11 +1,14 @@
[project]
name = "amrita_core"
version = "0.4.0.post1"
version = "0.4.1"
description = "Agent core of Project Amrita"
readme = "README.md"
requires-python = ">=3.10,<3.14"
dependencies = [
"aiofiles>=25.1.0",
"aiohttp>=3.13.3",
"fastmcp>=2.14.4",
"filetype>=1.2.0",
"jieba>=0.42.1",
"loguru>=0.7.3",
"openai>=2.16.0",
12 changes: 8 additions & 4 deletions src/amrita_core/builtins/adapter.py
@@ -17,7 +17,11 @@

from amrita_core.config import AmritaConfig
from amrita_core.logging import debug_log
from amrita_core.protocol import ModelAdapter
from amrita_core.protocol import (
COMPLETION_RETURNING,
ModelAdapter,
StringMessageContent,
)
from amrita_core.tools.models import ToolChoice, ToolFunctionSchema
from amrita_core.types import (
ModelConfig,
@@ -36,7 +40,7 @@ class OpenAIAdapter(ModelAdapter):
@override
async def call_api(
self, messages: Iterable[ChatCompletionMessageParam]
) -> AsyncGenerator[str | UniResponse[str, None], None]:
) -> AsyncGenerator[COMPLETION_RETURNING, None]:
"""Call OpenAI API to get chat responses"""
preset: ModelPreset = self.preset
preset_config: ModelConfig = preset.config
@@ -81,7 +85,7 @@ async def call_api(
)
if chunk.choices[0].delta.content is not None:
response += chunk.choices[0].delta.content
yield chunk.choices[0].delta.content
yield StringMessageContent(response)
debug_log(chunk.choices[0].delta.content)
except IndexError:
break
Expand All @@ -93,7 +97,7 @@ async def call_api(
if completion.choices[0].message.content is not None
else ""
)
yield response
yield StringMessageContent(response)
if completion.usage:
uni_usage = UniResponseUsage.model_validate(
completion.usage, from_attributes=True
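
The adapter change above stops yielding raw delta strings and instead yields the accumulated response so far, wrapped in `StringMessageContent`. The underlying pattern can be sketched as a plain async generator — `fake_chunks` and the unwrapped string yields are illustrative stand-ins, not the real adapter:

```python
import asyncio
from collections.abc import AsyncGenerator


async def fake_chunks() -> AsyncGenerator[str, None]:
    # Stand-in for the provider's streamed deltas.
    for delta in ["Hel", "lo ", "world"]:
        yield delta
        await asyncio.sleep(0)


async def stream_accumulated() -> AsyncGenerator[str, None]:
    """Yield the full response accumulated so far after each delta."""
    response = ""
    async for delta in fake_chunks():
        response += delta
        yield response  # the diff wraps this in StringMessageContent


async def main() -> list[str]:
    return [part async for part in stream_accumulated()]


print(asyncio.run(main()))  # prints ['Hel', 'Hello ', 'Hello world']
```

Yielding the cumulative text lets a consumer simply render the latest value instead of concatenating deltas itself.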
2 changes: 1 addition & 1 deletion src/amrita_core/builtins/agent.py
@@ -5,7 +5,6 @@
from copy import deepcopy
from typing import Any

from amrita_core.chatmanager import MessageWithMetadata
from amrita_core.config import AmritaConfig, get_config
from amrita_core.hook.event import CompletionEvent, PreCompletionEvent
from amrita_core.hook.exception import MatcherException as ProcEXC
@@ -14,6 +13,7 @@
tools_caller,
)
from amrita_core.logging import debug_log, logger
from amrita_core.protocol import MessageWithMetadata
from amrita_core.sessions import SessionsManager
from amrita_core.tools.manager import ToolsManager, on_tools
from amrita_core.tools.models import ToolContext