fix: normalize text-only prompt atoms in deepseek and tongyi #2762
LeekJay wants to merge 2 commits into langgenius:main
Conversation
This change fixes a plugin compatibility issue where Dify can forward prompt content as `TextPromptMessageContent` atoms instead of plain strings. Without normalization, deepseek and tongyi may pass those objects into provider serialization paths and fail at runtime.

Changes:
- add text-only prompt content normalization in deepseek before invoke cleanup
- add text-only prompt content normalization in tongyi before provider conversion
- keep mixed multimodal content untouched so existing provider handling remains intact

Impact:
- prevents `TextPromptMessageContent` serialization failures in plugin-based LLM calls
- reduces runtime `PluginInvokeError` and `UnsupportedDataType` failures for deepseek and tongyi workflows
- keeps provider behavior unchanged for non-text and mixed-content prompts
Code Review
This pull request addresses a runtime failure by normalizing text-only prompt atoms in the `deepseek` and `tongyi` plugins. The changes involve adding helper functions to convert lists of `TextPromptMessageContent` into single strings before they are processed by the provider-specific adapters.
My review has identified a couple of areas for improvement:
- There is significant code duplication between the `deepseek` and `tongyi` models. The new helper functions are identical in both files. Consolidating this logic into a shared module or base class would improve maintainability.
- The new `_normalize_text_content` function is missing type hints, which is inconsistent with the surrounding code. Adding them would improve code clarity and maintainability.
Overall, the fix is well-described and addresses the issue. The suggested improvements are aimed at enhancing code quality.
Summary
This PR normalizes prompt message content in the `deepseek` and `tongyi` official plugins before provider-specific conversion runs.

It fixes a compatibility issue where Dify can pass prompt content as `list[TextPromptMessageContent]`, which currently leaks into JSON serialization or provider adapters and causes runtime failures such as:

- `Object of type TextPromptMessageContent is not JSON serializable`
- `Unsupported atom data type: <class 'dify_plugin.entities.model.message.TextPromptMessageContent'>`

This is the same failure pattern we reproduced on Dify `1.13.1` with:

- `langgenius/deepseek:0.0.11`
- `langgenius/tongyi:0.1.33`

Related issues:
Fix #2753
Fix #2761
What changed
- Added a `_normalize_text_content()` helper to the `deepseek` and `tongyi` LLM implementations.
- Added a `_normalize_prompt_messages()` helper to copy prompt messages and collapse pure text atom lists into plain strings.

Why this is safe
The change is intentionally narrow:
- Only lists made up entirely of `TextPromptMessageContent` atoms are collapsed; all other content passes through unchanged.

Validation
- Ran `python -m py_compile models/deepseek/models/llm/llm.py models/tongyi/models/llm/llm.py`.
- On a `1.13.1` deployment, applied equivalent hot patches, restarted the plugin daemon / workers, and confirmed the workflow stopped producing `TextPromptMessageContent`, `UnsupportedDataType`, and `PluginInvokeError` log entries.