Add MiniMax as a first-class LLM provider (M2.7 default)#1834

Open
octo-patch wants to merge 2 commits into dottxt-ai:main from octo-patch:feature/add-minimax-provider

Conversation


@octo-patch octo-patch commented Mar 17, 2026

Summary

  • Add MiniMax as a first-class LLM provider with full sync/async support
  • MiniMax-M2.7 set as default model (latest flagship with enhanced reasoning and coding)
  • MiniMax-M2.7-highspeed available for low-latency scenarios
  • Previous models (M2.5, M2.5-highspeed) retained as alternatives

Changes

  • MiniMax and AsyncMiniMax model classes with temperature clamping and thinking-tag stripping
  • MiniMaxTypeAdapter with json_object fallback (MiniMax does not support strict json_schema)
  • from_minimax() factory function for easy setup
  • Comprehensive unit tests (37 tests) and integration tests
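To make the normalization behavior concrete, here is an illustrative sketch of the two response/request fixups listed above: clamping temperature into MiniMax's accepted (0.0, 1.0] range and stripping `<think>...</think>` blocks plus a wrapping markdown fence. Function names and the 0.01 floor are hypothetical, not the actual implementation in this PR.

```python
import re

# <think>...</think> blocks emitted by reasoning models; DOTALL so the
# pattern matches across newlines inside the block.
_THINK_RE = re.compile(r"<think>.*?</think>", re.DOTALL)


def clamp_temperature(temperature: float) -> float:
    """Clamp into MiniMax's (0.0, 1.0] range; the 0.01 floor is illustrative."""
    if temperature <= 0.0:
        return 0.01
    return min(temperature, 1.0)


def strip_thinking(text: str) -> str:
    """Drop reasoning tags, then unwrap a single markdown code fence."""
    text = _THINK_RE.sub("", text).strip()
    if text.startswith("```"):
        text = re.sub(r"^```[\w-]*\n", "", text)  # opening fence, e.g. ```json
        text = re.sub(r"\n?```$", "", text)       # closing fence
    return text.strip()
```

This keeps structured-output parsing downstream simple: by the time JSON parsing happens, the response is plain text with no reasoning preamble or fence wrapper.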

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities, available via an OpenAI-compatible API at https://api.minimax.io/v1.
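Since MiniMax does not support strict `json_schema` response formats, the type adapter falls back to `json_object` mode. One common way to preserve structure in that mode is to inject the schema into the prompt; a minimal sketch of that idea follows (the helper name and prompt wording are assumptions, not the PR's actual code):

```python
import json


def json_object_fallback(schema: dict, system_prompt: str) -> tuple[dict, str]:
    """Build request pieces for providers lacking strict json_schema support.

    Instead of response_format={"type": "json_schema", ...}, use json_object
    mode and append the schema to the system prompt so the model still sees
    the expected structure.
    """
    response_format = {"type": "json_object"}
    prompt = (
        f"{system_prompt}\n\n"
        "Respond with a JSON object matching this schema:\n"
        f"{json.dumps(schema)}"
    )
    return response_format, prompt
```

The trade-off is that `json_object` only guarantees syntactically valid JSON, not schema conformance, so callers should still validate the parsed output.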

Testing

  • All 37 unit tests passing
  • Integration tested with MiniMax API (M2.7 model)

PR Bot added 2 commits March 17, 2026 19:22
Add MiniMax Cloud API integration using the OpenAI-compatible SDK.
MiniMax offers M2.5 and M2.5-highspeed models with 204K context windows.

Key features:
- MiniMax and AsyncMiniMax model classes with sync/async support
- MiniMaxTypeAdapter handling json_object mode fallback
- Temperature clamping (MiniMax requires temp in (0.0, 1.0])
- Automatic stripping of thinking tags and markdown code fences
- from_minimax() factory function registered in outlines namespace
- 37 unit tests + 11 integration tests covering all code paths

Files changed:
- outlines/models/minimax.py (new)
- outlines/models/__init__.py (register provider)
- tests/models/test_minimax.py (new)
- tests/models/test_minimax_type_adapter.py (new)
- README.md (add MiniMax to API support list)

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to model documentation
- Set MiniMax-M2.7 as default model in examples and tests
- Keep all previous models (M2.5, M2.5-highspeed) as alternatives
- Update related unit tests
@octo-patch octo-patch changed the title Add MiniMax as a first-class LLM provider Add MiniMax as a first-class LLM provider (M2.7 default) Mar 18, 2026
