Jamba Large 1.7 is the latest model in the Jamba open family, offering improvements in grounding, instruction-following, and overall efficiency. Built on a hybrid SSM-Transformer architecture with a 256K context window, it delivers more accurate, contextually grounded responses and better steerability than previous versions.

Model Information

Model ID: ai21/jamba-large-1.7
Context Length: 256,000 tokens
Author: ai21
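
For illustration, here is a minimal sketch of calling the model by its ID through an OpenAI-compatible chat completions client. The base URL and the environment variable names are placeholder assumptions and not part of this page; substitute the values used by your inference provider.

```python
import os
from openai import OpenAI

# Placeholder endpoint and key (assumptions): replace with your provider's values.
client = OpenAI(
    base_url=os.environ.get("INFERENCE_BASE_URL", "https://example-inference-endpoint/v1"),
    api_key=os.environ["INFERENCE_API_KEY"],
)

response = client.chat.completions.create(
    model="ai21/jamba-large-1.7",  # model ID listed above
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this document in three bullet points."},
    ],
)

print(response.choices[0].message.content)
```

The 256,000-token context window means long documents can typically be passed in a single request rather than chunked, subject to the provider's own request limits.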

Capabilities