Moonshot AI: Moonlight 16B A3B Instruct

Moonlight-16B-A3B-Instruct is a Mixture-of-Experts (MoE) language model from Moonshot AI with 16B total parameters, of which roughly 3B are activated per inference step. It is optimized for instruction-following tasks and advances the Pareto frontier of performance per FLOP across English, coding, math, and Chinese benchmarks, outperforming comparable models such as Llama3-3B and Deepseek-v2-Lite. The model deploys easily through Hugging Face integration and is compatible with popular inference engines such as vLLM.
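Since the model is served through standard inference engines, a typical way to query it is via a vLLM server's OpenAI-compatible chat endpoint. The sketch below shows one way to do that; the base URL, port, and sampling parameters are assumptions about a local deployment, not details from this page.

```python
# Minimal sketch: querying Moonlight-16B-A3B-Instruct through a vLLM server's
# OpenAI-compatible /v1/chat/completions endpoint. The base URL and sampling
# settings are assumptions -- adjust them to match your deployment.
import json
from urllib import request

BASE_URL = "http://localhost:8000/v1"  # assumed local vLLM server
MODEL_ID = "moonshotai/moonlight-16b-a3b-instruct"

def build_chat_request(user_prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completion payload for the model."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": user_prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }

def send_chat_request(payload: dict) -> dict:
    """POST the payload to the server and return the parsed JSON response."""
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_chat_request("Explain Mixture-of-Experts in one sentence.")
print(payload["model"])
```

The same payload shape works against any OpenAI-compatible endpoint, so swapping in a hosted provider only requires changing `BASE_URL` and adding an authorization header.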

Model Information

Model ID

moonshotai/moonlight-16b-a3b-instruct

Context Length

8,192 tokens

Author

moonshotai
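The 8,192-token context length listed above bounds the prompt and the completion together, so it is worth budgeting both before sending a request. A minimal sketch, assuming a rough 4-characters-per-token heuristic (the model's actual tokenizer is not shown on this page):

```python
# Sketch: budgeting prompt and completion against the 8,192-token context
# window. The 4-chars-per-token estimate is an assumption; use the model's
# real tokenizer for exact counts.
CONTEXT_LENGTH = 8_192

def estimate_tokens(text: str) -> int:
    """Rough token estimate (~4 characters per token for English text)."""
    return max(1, len(text) // 4)

def max_completion_tokens(prompt: str, context_length: int = CONTEXT_LENGTH) -> int:
    """Tokens left for the completion after the (estimated) prompt is counted."""
    remaining = context_length - estimate_tokens(prompt)
    if remaining <= 0:
        raise ValueError("Prompt alone exceeds the context window")
    return remaining

print(max_completion_tokens("Hello" * 100))  # ~125 estimated prompt tokens
```

In practice the value returned here would cap the `max_tokens` field of a chat request so the server never rejects it for exceeding the context window.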

Capabilities