
Continue
✓ Verified
Open-source, model-agnostic AI coding assistant for VS Code and JetBrains
Our Review
Continue is an open-source VS Code and JetBrains extension for AI-assisted coding. It supports local models (Ollama, llama.cpp) and cloud providers (OpenAI, Anthropic, Gemini, DeepSeek) via your own API keys. There is no subscription for the extension; you pay only for external API usage if you choose cloud models.
We tested Continue with both local and cloud setups. For privacy-sensitive work, running Ollama locally gives you completions and chat without sending code off-device. For more capable models, connecting OpenAI or Anthropic keys is straightforward. The interface is familiar if you have used Copilot or Cursor Tab: inline completions, a chat panel, and codebase context.
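To make that setup concrete, here is a minimal sketch of a Continue `config.json` that mixes a local Ollama model with a cloud model behind your own API key. The model names and the key are placeholders, and newer Continue releases have moved toward a `config.yaml` format, so treat this as an outline and confirm the current schema in Continue's documentation:

```json
{
  "models": [
    {
      "title": "Llama 3.1 8B (local, via Ollama)",
      "provider": "ollama",
      "model": "llama3.1:8b"
    },
    {
      "title": "Claude (cloud, BYO key)",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-latest",
      "apiKey": "YOUR_ANTHROPIC_KEY"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Local autocomplete",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```

With the first entry selected, chat and completions stay on the local Ollama server; switching to the cloud entry in Continue's model dropdown sends requests out using your own key.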
Continue is ideal for developers who want open-source tooling, local model support, or maximum control over model choice and spend. It does not offer the same level of agentic multi-file automation as Cursor or Aider, but for daily autocomplete and chat it is solid. Teams that need SSO or central billing will need to look at commercial offerings.
Review updated December 6, 2025.
About
Continue is an open-source coding assistant that lets developers use their own models (OpenAI, Anthropic, local via Ollama/LM Studio) with custom prompts and workflows. It runs as a VS Code and JetBrains extension, supports chat and completions, and can be self-hosted or run fully locally for privacy.
Key Features
- ✓ Model-agnostic with BYO keys
- ✓ Open-source and customizable prompts/commands
- ✓ Local/offline support via Ollama or LM Studio
- ✓ Context providers for codebase awareness
- ✓ Chat, completions, and slash commands
- ✓ Self-host or run fully locally for privacy
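As an example of the prompt customization listed above, Continue's classic `config.json` let you define your own slash commands. The schema below is from the legacy JSON config (newer releases favor `config.yaml` and prompt files), and the command name and prompt text are illustrative, so check the current docs before relying on it:

```json
{
  "customCommands": [
    {
      "name": "check",
      "description": "Review selected code for bugs",
      "prompt": "Review the following code for bugs and edge cases, and suggest fixes:\n\n{{{ input }}}"
    }
  ]
}
```

Once defined, typing `/check` in the chat panel runs the prompt against the currently selected code.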
Pros & Cons
Pros
- Open-source and model-agnostic with BYO API keys
- Local/offline support via Ollama or LM Studio
- VS Code and JetBrains extensions; zero data retention when running models locally
Cons
- Less agentic multi-file automation than Cursor or Aider
- No built-in codebase indexing
Use Cases
- → Privacy-focused development with local models
- → Custom AI workflows with personalized prompts
- → Self-hosted AI coding for enterprises
- → Offline development with local LLMs
- → Experimenting with different AI models
- → Open-source projects and contributions
Frequently Asked Questions
What is Continue?
Continue is an open-source coding assistant that lets developers use their own models (OpenAI, Anthropic, local via Ollama/LM Studio) with custom prompts and workflows. It runs as a VS Code and JetBrains extension, supports chat and completions, and can be self-hosted or run fully locally for privacy.
Is Continue free?
Yes. Continue is open source and free to use; the free tier includes the VS Code and JetBrains extensions, chat and code completions, and custom prompts and slash commands.
What programming languages does Continue support?
Continue is model-agnostic: it supports every programming language that your chosen model supports.
What AI models does Continue use?
Continue can be powered by any LLM reachable via API, including OpenAI, Anthropic, Gemini, local models (Ollama/LM Studio), and custom endpoints.
What platforms does Continue support?
Continue is available on macOS, Linux, Windows.
What can Continue do?
Continue provides code completion, code generation, debugging assistance, and AI chat. Key features include model-agnostic BYO API keys, open-source and customizable prompts/commands, and local/offline support via Ollama or LM Studio.
Related Articles
Open-Weight Models Closing the Gap: GPT-OSS, Qwen3, Llama 4
A practical look at how open-weight coding models are catching up to frontier models: what's available and when to use them.
How to Set Up Ollama + Continue for Fully Private AI Coding
A step-by-step guide to running AI coding entirely on your machine with Ollama and Continue: zero cloud, zero API keys, full privacy.
The MCP Revolution: How One Protocol Connects Every AI Tool
A look at how Model Context Protocol (MCP) is unifying the AI coding ecosystem: one protocol, many servers, universal compatibility.
Workflow Resources
Cookbook
Local coding stack with Continue and Ollama
Set up a privacy-first local AI coding workflow with Continue, Ollama, and project-level rules.
Skill
Local model quality loop
Improve code output quality when using local AI models by combining rules files, iterative retries with error feedback, and test-backed validation gates.
Skill
Retrieval grounding pattern
Ground AI coding decisions in real documentation and repository sources before generating code, eliminating hallucinated APIs and outdated patterns.
MCP Server
Documentation MCP Server
MCP server pattern for giving AI coding agents direct access to versioned documentation, internal playbooks, and API references to reduce hallucinated guidance.
MCP Server
Filesystem MCP Server
Reference MCP server that grants AI coding agents controlled read/write access to local files and directories within sandboxed project boundaries.
Pricing and features change frequently—confirm on the vendor site.
We may earn a commission if you sign up. See our disclosure.
Pricing
Free (Open Source)
$0
- VS Code and JetBrains extensions
- Chat and code completions
- Custom prompts and slash commands
- Local/self-hosted model support (Ollama, LM Studio)
- BYO API keys for hosted models
Company
- Name: Continue (open source)
- Founded: 2023