Jamba 1.5 Large is part of AI21's new family of open models, offering superior speed, efficiency, and quality.

It features a 256K effective context window, the longest among open models, enabling improved performance on tasks like document summarization and analysis.
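To get a rough sense of that scale, the sketch below converts 256,000 tokens into characters and pages. The ~4 characters per token and ~3,000 characters per page figures are common heuristics, not AI21-published numbers.

```python
# Rough capacity estimate for a 256K-token context window.
# Assumes ~4 characters per token and ~3,000 characters per page
# (common heuristics, not AI21-published figures).
CONTEXT_TOKENS = 256_000
CHARS_PER_TOKEN = 4
CHARS_PER_PAGE = 3_000

total_chars = CONTEXT_TOKENS * CHARS_PER_TOKEN   # 1,024,000 characters
approx_pages = total_chars // CHARS_PER_PAGE     # roughly 341 pages

print(total_chars, approx_pages)
```

Even with conservative heuristics, the window comfortably holds several book-length documents at once, which is what makes whole-document summarization and analysis feasible without chunking.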

Built on a novel SSM-Transformer architecture, it outperforms larger models like Llama 3.1 70B on benchmarks while maintaining resource efficiency.

Read AI21's announcement to learn more.

Model Information

Model ID: ai21/jamba-1-5-large
Context Length: 256,000 tokens
Author: ai21
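As a sketch of how the model ID above might be used, the snippet below builds a chat-completion request body. The field names follow the widely used OpenAI-style chat schema and are assumptions for illustration; the actual endpoint and schema depend on the hosting provider.

```python
import json

# Hypothetical chat-completion request body referencing the model ID above.
# Field names follow the common OpenAI-style chat schema; the real schema
# depends on the provider hosting the model.
payload = {
    "model": "ai21/jamba-1-5-large",
    "messages": [
        {"role": "system", "content": "You are a concise summarizer."},
        {"role": "user", "content": "Summarize the attached report."},
    ],
    "max_tokens": 512,
}

body = json.dumps(payload)
print(body)
```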

Capabilities