Zephyr 141B-A35B is a Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters. It was fine-tuned on a mix of publicly available, synthetic datasets.

It is an instruct fine-tune of Mixtral 8x22B.

#moe

Model Information

Model ID: huggingfaceh4/zephyr-orpo-141b-a35b

Context Length: 65,536 tokens

Author: huggingfaceh4
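
Assuming the model is served behind an OpenAI-compatible chat-completions endpoint (an assumption; this listing does not specify the serving API), a minimal sketch of querying it with the Model ID above might look like this. The `base_url` and API key are placeholders, not details from this listing:

```python
# Minimal sketch: calling the model through an OpenAI-compatible
# chat-completions endpoint. base_url and api_key are hypothetical;
# substitute the values for whichever provider hosts this model.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",                 # hypothetical credential
)

response = client.chat.completions.create(
    model="huggingfaceh4/zephyr-orpo-141b-a35b",  # Model ID from this listing
    messages=[
        {"role": "system", "content": "You are Zephyr, a helpful assistant."},
        {"role": "user", "content": "Summarize what a Mixture of Experts model is."},
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```

Because only the 35B active parameters are used per token, inference cost scales with the active rather than the total parameter count, which is the usual motivation for MoE architectures like this one.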

Capabilities