Open Source Models
Open-weight coding models for local or self-hosted usage
Open-source models can be run locally or self-hosted, giving you control over data, cost, and deployment. They are often smaller or more specialized than frontier commercial models but avoid vendor lock-in and API dependency.
Running open-source models requires hardware (GPU recommended), tooling like Ollama or llama.cpp, and sometimes fine-tuning for specific tasks. Many coding tools—Continue, Cline, OpenCode, Aider—support local models via OpenAI-compatible APIs, so you can swap between cloud and local without changing your workflow.
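Because local servers such as Ollama expose an OpenAI-compatible API (by default at `http://localhost:11434/v1`), swapping between cloud and local usually means changing only the base URL. As a minimal sketch, the hypothetical helper below builds a `/chat/completions` request the same way for either target; the model name is just an example:

```python
import json

# Ollama's default OpenAI-compatible endpoint (hypothetical cloud URL would differ only here)
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(base_url, model, prompt):
    """Return (url, JSON body) for a POST to an OpenAI-compatible /chat/completions endpoint."""
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# Example: point the same request at a local model (model name is illustrative)
url, body = build_chat_request(OLLAMA_BASE, "qwen2.5-coder", "Write a hello-world in Go")
```

Tools like Continue, Cline, OpenCode, and Aider do this translation for you; in their settings you typically only supply the base URL and model name.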
For setup guidance, see our guides on local AI coding with Continue and Ollama and the local stack cookbook. For tools that support open-source models, browse Continue and Cline.