Jamba 1.5 Mini is the world's first production-grade Mamba-based model, combining state-space model (SSM) and Transformer layers in a hybrid architecture that delivers a 256K-token context window with high efficiency.

It supports nine languages and matches or outperforms comparable small models on a range of writing and analysis tasks.

Thanks to its hybrid design, the model requires less memory and processes long inputs faster than previous Transformer-only architectures.

Read AI21's announcement to learn more.

Model Information

Model ID: ai21/jamba-1-5-mini
Context Length: 256,000 tokens
Author: ai21
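
The model ID above can be used with any chat completions client your provider supports. Below is a minimal sketch using the OpenAI Python SDK against an assumed OpenAI-compatible endpoint; the base URL and the JAMBA_API_KEY environment variable are placeholders, not documented values.

```python
import os

from openai import OpenAI

# Placeholder endpoint and key: substitute your provider's
# OpenAI-compatible inference URL and credential.
client = OpenAI(
    base_url="https://your-inference-endpoint/v1",
    api_key=os.environ["JAMBA_API_KEY"],
)

# Call Jamba 1.5 Mini by the model ID listed in the table above.
response = client.chat.completions.create(
    model="ai21/jamba-1-5-mini",
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize this contract clause in two sentences."},
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```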

Capabilities