feat: add MiniMax as LLM provider #838

Merged
harry0703 merged 2 commits into harry0703:main from octo-patch:add-minimax-llm-provider on Apr 2, 2026

Conversation

octo-patch (Contributor) commented on Mar 13, 2026

Summary

Add MiniMax as an LLM provider for MoneyPrinterTurbo, using the OpenAI-compatible API.

Changes

  • Add MiniMax provider routing in app/services/llm.py
  • Add MiniMax configuration (API key, base URL, model name) in config.example.toml
  • Default model: MiniMax-M2.7 (also available: MiniMax-M2.7-highspeed)
  • Update README (CN & EN) to list MiniMax as a supported provider

Configuration

llm_provider = "minimax"
minimax_api_key = "your-api-key"
minimax_base_url = "https://api.minimax.io/v1"
minimax_model_name = "MiniMax-M2.7"
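
Since the PR routes MiniMax through the OpenAI-compatible API, the provider branch presumably just maps these config keys onto the standard client parameters. Below is a minimal, hypothetical sketch of what such routing in app/services/llm.py could look like; the function and field names are assumptions based on this PR description, not the actual implementation.

```python
# Hypothetical sketch of MiniMax provider routing (names are assumptions,
# not the actual code in app/services/llm.py).
from dataclasses import dataclass


@dataclass
class LLMConfig:
    """Subset of config.example.toml keys relevant to MiniMax."""
    llm_provider: str
    minimax_api_key: str = ""
    minimax_base_url: str = "https://api.minimax.io/v1"
    minimax_model_name: str = "MiniMax-M2.7"


def resolve_llm_settings(cfg: LLMConfig) -> dict:
    """Return the kwargs an OpenAI-compatible client would be given."""
    if cfg.llm_provider == "minimax":
        # MiniMax exposes an OpenAI-compatible endpoint, so the generic
        # OpenAI client can simply be pointed at its base URL.
        return {
            "api_key": cfg.minimax_api_key,
            "base_url": cfg.minimax_base_url,
            "model": cfg.minimax_model_name,
        }
    raise ValueError(f"unsupported llm_provider: {cfg.llm_provider}")
```

With `llm_provider = "minimax"`, this resolves to the base URL and model shown in the configuration above; any other provider value falls through to an error in this sketch.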

API Documentation

octo-patch and others added 2 commits on Mar 13, 2026 at 20:36
Add MiniMax (https://platform.minimaxi.com) as a new LLM provider option.
The MiniMax API is OpenAI-compatible and supports models like MiniMax-M1.

Changes:
- Add minimax provider branch in app/services/llm.py
- Add minimax configuration in config.example.toml
- Update both README.md and README-en.md to list MiniMax
- Update default model from MiniMax-M1 to MiniMax-M2.7
- Add MiniMax-M2.7-highspeed as available model option
- Update platform URL to platform.minimax.io
@harry0703 harry0703 merged commit 3011c3b into harry0703:main Apr 2, 2026