Deep Cogito: Cogito V2 Preview Deepseek 671B

Cogito v2 is a multilingual, instruction-tuned Mixture of Experts (MoE) large language model with 671 billion parameters. It supports both standard and reasoning-based generation modes. The model introduces hybrid reasoning via Iterated Distillation and Amplification (IDA), an iterative self-improvement strategy designed to scale alignment with general intelligence. Cogito v2 has been optimized for STEM, programming, instruction following, and tool use. It supports a 128k context length and offers strong performance in both multilingual and code-heavy environments. Users can control the reasoning behaviour with the `reasoning enabled` boolean. Learn more in our docs.
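As a sketch of how the reasoning toggle might be used, the snippet below builds an OpenAI-compatible chat completions payload with a `reasoning.enabled` flag. The model slug and the exact field name for the flag are assumptions for illustration; consult the provider's docs for the real parameter names.

```python
def build_request(prompt: str, reasoning_enabled: bool) -> dict:
    """Build a chat completions payload for Cogito v2.

    `reasoning.enabled` (hypothetical field name) switches between
    standard and reasoning-based generation modes.
    """
    return {
        # Assumed model slug -- verify against the provider's model list.
        "model": "deepcogito/cogito-v2-preview-deepseek-671b",
        "messages": [{"role": "user", "content": prompt}],
        # Hypothetical flag controlling the hybrid reasoning mode.
        "reasoning": {"enabled": reasoning_enabled},
    }

# Standard generation: reasoning disabled.
standard = build_request("Summarize MoE routing.", reasoning_enabled=False)
# Reasoning-based generation: reasoning enabled.
thinking = build_request("Summarize MoE routing.", reasoning_enabled=True)
```

The payload would then be POSTed to the provider's chat completions endpoint; only the `reasoning` block differs between the two modes.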
