Note: This model is being deprecated. The recommended replacement is the newer Ministral 8B.

This model is currently powered by Mistral-7B-v0.2 and incorporates an improved fine-tuning of the initial Mistral 7B, inspired by community work. It is best suited to large batch processing tasks where cost is a significant factor but strong reasoning capability is not required.

Model Information

Model ID: mistralai/mistral-tiny
Context Length: 32,768 tokens
Author: mistralai
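The model ID above is what a client passes when requesting completions. The sketch below is illustrative only: it assumes an OpenAI-compatible chat completions API, and the base URL and API key variable are placeholders, not values taken from this page.

```python
# Minimal sketch, assuming an OpenAI-compatible chat completions endpoint.
# The base_url and environment variable name below are placeholders (assumptions).
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://example-provider.invalid/v1",  # placeholder endpoint, not confirmed
    api_key=os.environ["PROVIDER_API_KEY"],          # placeholder variable name
)

# Request a completion from mistral-tiny using the model ID listed above.
response = client.chat.completions.create(
    model="mistralai/mistral-tiny",
    messages=[{"role": "user", "content": "Summarize this support ticket in one sentence."}],
    max_tokens=128,
)

print(response.choices[0].message.content)
```

Note that the 32,768-token context length bounds the combined size of the prompt and the generated output in a single request.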

Capabilities