
DeepSeek Coder


Open-source MoE coding model (V2) with 128K context

Updated Feb 26, 2026
Open Source Model · Open Source · All platforms via API
Last reviewed: Feb 26, 2026. Details verified against vendor changelogs and hands-on usage where possible.

About

DeepSeek Coder V2 is an open Mixture-of-Experts coding model (16B Lite and 236B) with a 128K context window and support for 338 programming languages. Weights are published on Hugging Face for self-hosting; code is MIT-licensed and the model weights use the DeepSeek Model License. Hosted inference is available through the DeepSeek API using OpenAI-compatible endpoints.

Key Features

  • DeepSeek-Coder-V2 Mixture-of-Experts models (236B total / 21B active, plus 16B Lite / 2.4B active)
  • 128K context window with fill-in-the-middle support
  • Supports 338 programming languages
  • Open weights for self-hosting with MIT code and DeepSeek Model License
  • OpenAI-compatible API access via the DeepSeek platform
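Since the listing highlights OpenAI-compatible endpoints, a hosted chat request can be sketched as a plain HTTP call. The base URL and `deepseek-chat` model name below follow the vendor's documentation at the time of writing and should be treated as assumptions to re-verify:

```python
import json
import urllib.request

# OpenAI-compatible chat-completion payload for the hosted DeepSeek platform.
# Base URL and model name are assumptions from vendor docs; confirm them
# against the current DeepSeek API reference before use.
API_BASE = "https://api.deepseek.com"
payload = {
    "model": "deepseek-chat",
    "messages": [
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
    "temperature": 0.0,
}

def build_request(api_key: str) -> urllib.request.Request:
    """Assemble the POST request; any OpenAI-compatible client works the same way."""
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# To send it (requires a DeepSeek API key):
# req = build_request("sk-...")  # e.g. read the key from an environment variable
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape matches OpenAI's, existing OpenAI SDKs can be pointed at the DeepSeek base URL instead of hand-building HTTP calls.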

Pros & Cons

Pros

  • Open weights for self-hosting (MIT code, DeepSeek Model License)
  • 128K context; 338 programming languages
  • OpenAI-compatible API at ~$0.28/$0.42 per 1M tokens

Cons

  • Self-hosting requires your own compute
  • No agentic mode

Use Cases

  • Budget-friendly AI coding at scale
  • Self-hosted AI coding solutions
  • Open-source development
  • Cost-sensitive production deployments

Technical Details

Languages

338 programming languages

AI Models

DeepSeek-Coder-V2-Instruct (236B / 21B active), DeepSeek-Coder-V2-Lite-Instruct (16B / 2.4B active)

Integrations

Self-hosted, Hugging Face, OpenAI-compatible API clients


Frequently Asked Questions

What is DeepSeek Coder?

DeepSeek Coder V2 is an open Mixture-of-Experts coding model (16B Lite and 236B) with a 128K context window and support for 338 programming languages. Weights are published on Hugging Face for self-hosting; code is MIT-licensed and the model weights use the DeepSeek Model License. Hosted inference is available through the DeepSeek API using OpenAI-compatible endpoints.

Is DeepSeek Coder free?

Yes. The weights are free to download from Hugging Face (16B Lite and 236B Base/Instruct) and can run on your own hardware or cloud. The code is MIT-licensed, and the weights are released under the DeepSeek Model License.

What programming languages does DeepSeek Coder support?

DeepSeek Coder supports 338 programming languages.

What AI models does DeepSeek Coder use?

DeepSeek Coder is powered by DeepSeek-Coder-V2-Instruct (236B / 21B active), DeepSeek-Coder-V2-Lite-Instruct (16B / 2.4B active).

What platforms does DeepSeek Coder support?

DeepSeek Coder is available on all platforms via its OpenAI-compatible API.

What can DeepSeek Coder do?

DeepSeek Coder provides code completion, code generation, debugging, and AI chat. Key features include Mixture-of-Experts models (236B total / 21B active, plus 16B Lite / 2.4B active), a 128K context window with fill-in-the-middle support, and support for 338 programming languages.


Visit DeepSeek Coder

Pricing and features change frequently; confirm on the vendor site.

We may earn a commission if you sign up. See our disclosure.

Pricing

Open weights (self-hosted)

$0

  • Weights on Hugging Face (16B Lite, 236B Base/Instruct)
  • Code under MIT license; weights under DeepSeek Model License
  • Run on your own hardware or cloud
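The self-hosted tier above can be sketched with Hugging Face transformers. The repository id matches the published Lite weights, but the dtype, device, and generation settings here are illustrative assumptions, and the 236B variant needs multi-GPU serving instead:

```python
# Sketch of self-hosting DeepSeek-Coder-V2-Lite-Instruct with transformers.
# The repo id is the published one; device/dtype choices are assumptions.
MODEL_ID = "deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct"

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model and produce a completion. Run this on a GPU machine;
    it is not called here because the weights are a multi-GB download."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        trust_remote_code=True,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(out[0][inputs.shape[1]:], skip_special_tokens=True)

# generate("Write a binary search in Go.")  # run where the weights fit in memory
```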

DeepSeek API (V3.2 chat/reasoner)

Pay-as-you-go

  • Hosted access via DeepSeek platform
  • OpenAI-compatible API with JSON, tool calls, FIM, and caching
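The fill-in-the-middle (FIM) support listed above is exposed through a completions-style request that takes both a prefix and a suffix. The `/beta` path and field names below follow the vendor docs at the time of writing and are assumptions to re-check:

```python
import json
import urllib.request

# Fill-in-the-middle request: the model completes the gap between
# `prompt` (code before the cursor) and `suffix` (code after it).
# The /beta endpoint path is an assumption from vendor docs.
payload = {
    "model": "deepseek-chat",
    "prompt": "def fib(n):\n    ",
    "suffix": "\n    return a",
    "max_tokens": 64,
}

def build_fim_request(api_key: str) -> urllib.request.Request:
    """Assemble the FIM POST request against the beta completions endpoint."""
    return urllib.request.Request(
        "https://api.deepseek.com/beta/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```

This is the request shape editors use for inline completion: the surrounding file contents go into `prompt` and `suffix`, and the response fills the middle.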

Company

Name
DeepSeek
Founded
2023
Location
China
Users
Open-source community


Similar Tools

Compare DeepSeek Coder with these alternatives

