DeepSeek Coder
Open-source MoE coding model (V2) with 128K context
DeepSeek Coder V2 is an open Mixture-of-Experts coding model, available as a 16B Lite variant and a 236B variant, with a 128K context window and support for 338 programming languages. Weights are published on Hugging Face for self-hosting; the code is MIT-licensed, while the model weights are distributed under the DeepSeek Model License. Hosted inference is available through the DeepSeek API using OpenAI-compatible endpoints.
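Because the hosted API is OpenAI-compatible, the standard `openai` Python SDK can be pointed at it directly. A minimal sketch, assuming an API key in the `DEEPSEEK_API_KEY` environment variable and `deepseek-coder` as the served model name (check the DeepSeek API docs for the current identifier):

```python
# Minimal sketch of hosted inference through DeepSeek's OpenAI-compatible API.
# Assumptions: the `openai` Python SDK is installed, DEEPSEEK_API_KEY is set,
# and "deepseek-coder" is the model id exposed by the API (verify in the docs).
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",  # OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-coder",  # assumed model identifier
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a linked list."},
    ],
)

print(response.choices[0].message.content)
```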
Key Features:
- DeepSeek-Coder-V2 Mixture-of-Experts models: 236B total / 21B active parameters, plus a 16B Lite variant with 2.4B active
- 128K context window with fill-in-the-middle (FIM) support (see the self-hosting sketch after this list)
- Support for 338 programming languages
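For self-hosting, the published weights can be loaded with Hugging Face transformers. A minimal sketch of local FIM completion, assuming the 16B Lite base checkpoint lives at `deepseek-ai/DeepSeek-Coder-V2-Lite-Base` and that the FIM sentinel tokens follow the format documented for DeepSeek-Coder (verify both against the model card and tokenizer config):

```python
# Minimal self-hosting sketch with Hugging Face transformers.
# Assumptions: the 16B Lite base checkpoint is published as
# "deepseek-ai/DeepSeek-Coder-V2-Lite-Base", and the FIM sentinel tokens below
# match the format documented for DeepSeek-Coder; check the tokenizer first.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-Coder-V2-Lite-Base"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,  # the MoE architecture code ships with the repo
)

# FIM prompt: the model fills in the code between the prefix and the suffix.
prompt = (
    "<｜fim▁begin｜>def fibonacci(n):\n"
    '    """Return the n-th Fibonacci number."""\n'
    "<｜fim▁hole｜>\n"
    "    return a\n"
    "<｜fim▁end｜>"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```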
Available on all platforms via the DeepSeek API.