feat: add MiniMax provider support#326

Open
octo-patch wants to merge 1 commit into whylabs:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

This PR adds MiniMax LLM provider support to LangKit's extensible LLMInvocationParams architecture.

Changes

  • langkit/openai/openai.py: Added two new provider classes:
    • MiniMaxDefault: uses the MiniMax-M2.7 model (the default; highest quality)
    • MiniMaxHighSpeed: uses the MiniMax-M2.7-highspeed model (optimized for lower latency)
    • Added create_minimax_chat_completion() helper supporting both legacy (v0) and modern (v1+) openai library versions
  • langkit/openai/__init__.py: Exported MiniMaxDefault and MiniMaxHighSpeed
  • langkit/tests/test_minimax.py: Added 11 unit tests covering model names, temperature defaults, copy behavior, completion calls, and custom base URL support
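For context, the two provider classes could look roughly like the sketch below. This is illustrative only: the `LLMInvocationParams` base shown here is a dataclass stand-in for LangKit's actual class, and field names are assumptions, not the PR's exact code.

```python
from dataclasses import dataclass, replace


@dataclass
class LLMInvocationParams:
    # Stand-in for LangKit's actual base class, which carries the
    # parameters handed to the underlying chat-completion call.
    model: str = ""
    temperature: float = 1.0

    def copy(self) -> "LLMInvocationParams":
        # Independent copy, so tweaking one provider instance
        # does not mutate another (covered by the PR's copy tests).
        return replace(self)


@dataclass
class MiniMaxDefault(LLMInvocationParams):
    # Default provider: the full MiniMax-M2.7 model.
    model: str = "MiniMax-M2.7"
    temperature: float = 1.0


@dataclass
class MiniMaxHighSpeed(LLMInvocationParams):
    # Speed-optimized variant of the same model family.
    model: str = "MiniMax-M2.7-highspeed"
    temperature: float = 1.0
```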

Usage

from langkit.openai import MiniMaxDefault, MiniMaxHighSpeed
from langkit.openai.openai import Conversation

# Set MINIMAX_API_KEY in environment
provider = MiniMaxDefault()
conv = Conversation(provider)
result = conv.send_prompt("Hello!")
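The dual-version support mentioned in the Changes section can be illustrated with a small capability check. This is a hedged sketch, not the PR's actual implementation: openai>=1.0 exposes an `OpenAI` client class, while the legacy v0 library used module-level helpers, so `create_minimax_chat_completion()` can branch on that difference.

```python
def uses_modern_openai_client(openai_module) -> bool:
    # openai>=1.0 ships an OpenAI client class; the legacy v0 library
    # instead exposed module-level helpers such as openai.ChatCompletion.
    return hasattr(openai_module, "OpenAI")


# Inside create_minimax_chat_completion(), the branch could look like
# (pseudocode, since it needs a live openai install and API key):
#
#   if uses_modern_openai_client(openai):
#       client = openai.OpenAI(api_key=api_key, base_url=base_url)
#       return client.chat.completions.create(model=model, messages=messages)
#   else:
#       openai.api_key = api_key
#       openai.api_base = base_url
#       return openai.ChatCompletion.create(model=model, messages=messages)
```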

Configuration

Env Var | Description | Default
--- | --- | ---
MINIMAX_API_KEY | MiniMax API key (required) | (none)
MINIMAX_BASE_URL | Custom base URL | https://api.minimax.io/v1
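A sketch of how these variables could be resolved (the function name and error message are illustrative; the PR's actual helper may differ):

```python
import os

DEFAULT_MINIMAX_BASE_URL = "https://api.minimax.io/v1"


def resolve_minimax_config(env=None):
    # Read the required API key and the optional base-URL override
    # from the environment (or a supplied mapping, for testing).
    env = os.environ if env is None else env
    api_key = env.get("MINIMAX_API_KEY")
    if not api_key:
        raise ValueError("MINIMAX_API_KEY environment variable is required")
    base_url = env.get("MINIMAX_BASE_URL", DEFAULT_MINIMAX_BASE_URL)
    return api_key, base_url
```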

Key Constraints

  • Temperature default is 1.0 (MiniMax API requires range (0.0, 1.0])
  • Uses OpenAI-compatible API endpoint
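The temperature constraint can be made concrete with a small guard. This is a hypothetical helper to illustrate the (0.0, 1.0] range, not code from the PR:

```python
def check_minimax_temperature(t: float) -> float:
    # MiniMax accepts temperatures in (0.0, 1.0]: strictly greater
    # than 0 and at most 1, so 0.0 itself is rejected. This is why
    # the provider defaults to 1.0.
    if not (0.0 < t <= 1.0):
        raise ValueError(f"temperature must be in (0.0, 1.0], got {t}")
    return t
```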

Commit Message

- Add MiniMaxDefault provider (MiniMax-M2.7 model)
- Add MiniMaxHighSpeed provider (MiniMax-M2.7-highspeed model)
- Use OpenAI-compatible API via https://api.minimax.io/v1
- Support MINIMAX_API_KEY and optional MINIMAX_BASE_URL env vars
- Default temperature set to 1.0 per MiniMax API requirements
- Export new classes from langkit.openai module
- Add unit tests for both MiniMax providers
