
Beginner · 25-40 min

Local coding stack with Continue and Ollama

Set up a privacy-first local AI coding workflow with Continue, Ollama, and project-level rules.

Last reviewed Feb 25, 2026

Why this stack

  • fully local execution, so source code never leaves your machine
  • controllable model selection: swap models per task, with no vendor lock-in
  • low latency for everyday edits, since there are no network round trips

Setup outline

  1. Install Ollama and pull one or more coding models
  2. Install the Continue extension in VS Code or JetBrains
  3. Configure Continue's model provider to point at the local Ollama endpoint
  4. Add project rules for style and architecture constraints

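The steps above can be sketched as a short shell session. The model name and the Continue `config.yaml` schema (`models`, `provider`, `apiBase`) are illustrative assumptions, not authoritative; check the Continue and Ollama docs for your versions. The config is written to a temp path here so the sketch has no side effects.

```shell
# 1) Pull a local coding model (run once Ollama is installed):
#    ollama pull qwen2.5-coder:7b

# 2) Point Continue at the local Ollama endpoint. Continue normally reads
#    ~/.continue/config.yaml; we write a temp copy for illustration.
CFG="$(mktemp -d)/config.yaml"
cat > "$CFG" <<'EOF'
name: local-stack
models:
  - name: qwen2.5-coder          # illustrative model name
    provider: ollama             # route requests to local Ollama
    model: qwen2.5-coder:7b
    apiBase: http://localhost:11434   # Ollama's default endpoint
EOF
cat "$CFG"
```

After editing the real config, restart or reload Continue so it picks up the new provider.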
Daily workflow

  1. Ask for an implementation plan before requesting any code changes
  2. Apply one patch at a time, reviewing each diff before the next
  3. Run lint and tests locally after every applied patch
  4. Request a focused code review summary of the final diff
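Steps 2 and 3 amount to a gate: apply a single patch only if it applies cleanly, then run checks before moving on. A minimal sketch of that loop, using a toy git repo and a toy patch (the repo contents, patch, and `bash -n` stand-in for your real lint/test commands are all illustrative):

```shell
set -e
dir="$(mktemp -d)"; cd "$dir"
git init -q
git config user.email demo@example.com && git config user.name demo
echo 'greet() { echo "hello"; }' > lib.sh
git add lib.sh && git commit -qm "baseline"

# A single focused patch, as the assistant might propose it:
cat > change.patch <<'EOF'
diff --git a/lib.sh b/lib.sh
--- a/lib.sh
+++ b/lib.sh
@@ -1 +1,2 @@
 greet() { echo "hello"; }
+farewell() { echo "bye"; }
EOF

# Gate 1: apply only if the patch applies cleanly against the working tree
git apply --check change.patch && git apply change.patch

# Gate 2: run checks before accepting the next patch
# (bash -n is a stand-in for your project's lint/test commands)
bash -n lib.sh && echo "checks passed"
```

Keeping each patch small makes a failed gate cheap to diagnose: you always know exactly which change broke the build.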

Quality controls

  • encode repo conventions (naming, layout, dependency rules) in the project rules
  • enforce explicit test commands so the model proposes verifiable changes
  • define a short retry strategy for low-confidence responses, e.g. re-ask with more context, then switch models
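The first two controls can live in a project rules file that Continue loads alongside the code. The path and Markdown-rules convention below are assumptions based on Continue's project-rules feature; verify the exact location for your Continue version. The rule text itself is a hypothetical example:

```shell
# Write an example rules file to a temp path for illustration;
# in a real repo this would live under the project's Continue rules directory.
RULES="$(mktemp -d)/coding-standards.md"
cat > "$RULES" <<'EOF'
- Follow the existing module layout: one package per feature, no cross-imports.
- Every new function needs a unit test; run tests with `make test` before proposing a patch.
- Prefer small diffs; if a repo convention is unclear, ask instead of guessing.
EOF
cat "$RULES"
```

Because rules travel with the repo, every contributor's local assistant inherits the same constraints without per-machine setup.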

Good fits

  • solo developer workflows
  • privacy-sensitive repositories
  • offline-capable coding environments

MCP servers used