Merged
Contributor
Reviewer's Guide (collapsed on small PRs)
Refactors the message-dropping logic into a reusable method that correctly handles paired tool messages, tightens memory- and token-based truncation behavior, fixes MCP client reinitialization to pass the correct object, improves MCP connection error logging, and adjusts the timing of the chat task running flag, along with a patch version bump.
Sequence diagram for MCP client reinitialization and improved error logging
sequenceDiagram
participant Manager as MCPClientManager
participant Client as MCPClient
participant Server as MCPServer
participant Logger as Logger
Manager->>Manager: reinitalize_all()
activate Manager
loop for each client in clients copy
Manager->>Manager: unregister_client(client.server_script, False)
Manager->>Manager: _load_this(client, fail_then_raise=False)
activate Manager
Manager->>Client: connect()
alt connection fails
Client-->>Manager: raise Exception e
Manager->>Logger: opt(exception=e, colors=True).error(failed message)
else connection succeeds
Client-->>Manager: tools_remapping_tmp
Manager->>Manager: tools_remapping.update(tools_remapping_tmp)
Manager->>Logger: info(success message)
end
deactivate Manager
end
deactivate Manager
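The sequence above can be sketched in plain Python. Everything below is a stand-in written for illustration: the real `MCPClient`/`MCPClientManager` live in the project and log failures via `logger.opt(exception=e, colors=True).error(...)` (loguru), whereas this sketch records errors in a list instead.

```python
import asyncio


class MCPClient:
    """Hypothetical stand-in for the project's MCP client."""

    def __init__(self, server_script, should_fail=False):
        self.server_script = server_script
        self.should_fail = should_fail

    async def connect(self):
        if self.should_fail:
            raise ConnectionError(f"cannot reach {self.server_script}")
        # The real client returns a tool-name remapping dict on success.
        return {f"{self.server_script}.tool": self}


class MCPClientManager:
    """Hypothetical stand-in mirroring the flow in the sequence diagram."""

    def __init__(self):
        self.clients = []
        self.tools_remapping = {}
        self.errors = []  # the real code logs via loguru instead

    async def _load_this(self, client, fail_then_raise=True):
        try:
            tools_remapping_tmp = await client.connect()
        except Exception as e:
            if fail_then_raise:
                raise
            # Real code: logger.opt(exception=e, colors=True).error(...)
            self.errors.append((client.server_script, e))
        else:
            self.tools_remapping.update(tools_remapping_tmp)

    async def reinitalize_all(self):
        # Iterate over a copy so unregistering doesn't skip clients.
        for client in list(self.clients):
            # Stand-in for unregister_client(client.server_script, False).
            self.tools_remapping = {
                k: v for k, v in self.tools_remapping.items() if v is not client
            }
            await self._load_this(client, fail_then_raise=False)
```

With one healthy and one failing client, `asyncio.run(manager.reinitalize_all())` leaves the healthy client's tools registered and records (rather than raises) the failure.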
Class diagram for updated ChatManager message dropping and limits
classDiagram
class ChatManager {
List messages
List _dropped_messages
Config config
datetime last_call
bool _pending
bool _is_running
Memory memory
void _make_abstract()
void _drop_message()
void run_enforce()
void _limit_length()
int get_token(memory)
void __call__(event, matcher, bot)
}
class Memory {
MessageList memory
}
class MessageList {
List messages
}
class Message {
string role
string content
}
class Config {
LLMConfig llm_config
SessionConfig session
}
class LLMConfig {
int memory_lenth_limit
}
class SessionConfig {
int session_max_tokens
}
ChatManager --> Memory
Memory --> MessageList
MessageList --> Message
ChatManager --> Config
Config --> LLMConfig
Config --> SessionConfig
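A hedged sketch of the truncation behavior the diagram implies: drop from the front of memory until both the `memory_lenth_limit` message cap and the `session_max_tokens` budget hold, without splitting a tool-call/tool-result pair. The class shapes and the token counter here are simplified stand-ins, not the project's real implementations.

```python
from dataclasses import dataclass, field


@dataclass
class Message:
    role: str
    content: str


@dataclass
class ChatMemory:
    messages: list = field(default_factory=list)


def count_tokens(messages):
    # Crude stand-in for the real get_token(memory) tokenizer.
    return sum(len(m.content.split()) for m in messages)


def drop_message(memory, dropped):
    """Drop the oldest message; also drop a following tool reply so
    tool-call/tool-result pairs never get separated."""
    if not memory.messages:
        return
    dropped.append(memory.messages.pop(0))
    if memory.messages and memory.messages[0].role == "tool":
        dropped.append(memory.messages.pop(0))


def limit_length(memory, dropped, length_limit, token_limit):
    while memory.messages and (
        len(memory.messages) > length_limit
        or count_tokens(memory.messages) > token_limit
    ):
        before = len(memory.messages)
        drop_message(memory, dropped)
        if len(memory.messages) == before:  # no progress; avoid spinning
            break
```

Dropping an assistant message that is followed by a `tool` message removes both in one step, which is the paired-tool-message handling the reviewer's guide describes.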
Class diagram for updated MCPClientManager loading behavior
classDiagram
class MCPClientManager {
List clients
dict tools_remapping
Lock _lock
void _load_this(client, fail_then_raise)
void reinitalize_all()
void initialize_all(lock)
void unregister_client(server_script, remove_from_list)
}
class MCPClient {
string server_script
void connect()
}
class Logger {
void opt(exception, colors)
void error(message)
void info(message)
}
MCPClientManager --> MCPClient
MCPClientManager --> Logger
MCPClientManager --> Lock
File-Level Changes
Tips and commands
Interacting with Sourcery
Customizing Your Experience
Access your dashboard to:
Getting Help
Hey - I've found 1 issue, and left some high-level feedback:
- In `_drop_message`, you assume `data.memory.messages[0]` exists after the first pop; if this helper is reused elsewhere in the future, it would be safer to re-check `len(data.memory.messages)` before accessing index 0 to avoid a potential IndexError.
- The change in `_make_abstract` from `self._dropped_messages + dropped_part` to only `dropped_part` means previously dropped messages are no longer included in the summary; please double-check that this behavior change is intentional for your summarization logic.
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- In `_drop_message`, you assume `data.memory.messages[0]` exists after the first pop; if this helper is reused elsewhere in the future, it would be safer to re-check `len(data.memory.messages)` before accessing index 0 to avoid potential IndexError.
- The change in `_make_abstract` from `self._dropped_messages + dropped_part` to only `dropped_part` means previously dropped messages are no longer included in the summary; please double-check that this behavior change is intentional for your summarization logic.
## Individual Comments
### Comment 1
<location> `amrita/plugins/chat/chatmanager.py:249-254` </location>
<code_context>
else:
debug_log("未进行上下文摘要")
+ def _drop_message(self):
+ data = self.memory
+ if len(data.memory.messages) < 2:
+ return
+ self._dropped_messages.append(data.memory.messages.pop(0))
+ if data.memory.messages[0].role == "tool":
+ self._dropped_messages.append(data.memory.messages.pop(0))
+
</code_context>
<issue_to_address>
**issue (bug_risk):** Potential infinite loop if `memory_lenth_limit` can be 0 and `_drop_message` is a no-op for a single message.
Because `_drop_message` returns early when `len(data.memory.messages) < 2`, it can be a no-op. In `_limit_length`, this is called inside `while len(data.memory.messages) > 0:` with only a `len(messages) > memory_lenth_limit` guard. If `memory_lenth_limit` can be 0, then with exactly one `assistant`/`user` message left, the condition stays true, `_drop_message()` does nothing, and the loop never makes progress. Consider either guaranteeing `_drop_message` always removes at least one message when invoked, or adding a guard (e.g. `len(messages) > 1` or an explicit `limit == 0` case) before calling it.
</issue_to_address>
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
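One way to address the progress concern, sketched on plain lists of dicts (the real method operates on `self.memory` and the project's `Message` objects):

```python
def drop_message_always_progress(messages, dropped):
    """Variant of _drop_message that removes at least one message
    whenever any remain, so a caller's while-loop always makes
    progress even if memory_lenth_limit is 0."""
    if not messages:
        return
    dropped.append(messages.pop(0))
    # Re-check the length before peeking at index 0, per the review.
    if messages and messages[0]["role"] == "tool":
        dropped.append(messages.pop(0))
```

Unlike the reviewed version, this variant has no `< 2` early return, so a single remaining message is still dropped and the `_limit_length` loop terminates.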
Summary by Sourcery
Refine chat message dropping and token-limiting behavior, improve MCP client reconnection, and bump the project version.
Bug Fixes:
Enhancements:
Build: