Mistral: Magistral Small 2506

Magistral Small is a 24B-parameter instruction-tuned model based on Mistral-Small-3.1 (2503), enhanced through supervised fine-tuning on traces from Magistral Medium and further refined via reinforcement learning. It is optimized for reasoning and supports more than 20 languages.

Model Information

Model ID: mistralai/magistral-small-2506
Context Length: 40,000 tokens
Author: mistralai
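
Since the model is served through a standard chat interface, a typical request only needs the model ID above and a prompt that fits within the 40,000-token context window. The sketch below is a minimal example assuming an OpenAI-compatible endpoint and the `openai` Python client; the base URL and API key are placeholders, not values from this page.

```python
# Minimal sketch: calling Magistral Small 2506 via an OpenAI-compatible
# chat completions endpoint. The base_url and api_key are placeholder
# assumptions; substitute your provider's actual values.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",                 # placeholder credential
)

response = client.chat.completions.create(
    model="mistralai/magistral-small-2506",  # model ID from this page
    messages=[
        {"role": "user", "content": "Walk through the reasoning: is 1001 prime?"},
    ],
    max_tokens=2048,  # prompt plus completion must stay within the 40,000-token context
)

print(response.choices[0].message.content)
```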

Capabilities