Beginner · 25-40 min
Local coding stack with Continue and Ollama
Set up a privacy-first local AI coding workflow with Continue, Ollama, and project-level rules.
Last reviewed Feb 25, 2026
Why this stack
- Code and prompts never leave your machine, so no cloud data-sharing review is needed
- You choose exactly which models run and when they change
- Low latency for everyday edits, with no network round trip
Setup outline
- Install Ollama and pull one or more coding models
- Install the Continue extension in VS Code or JetBrains
- Configure Continue's model provider to target the local Ollama endpoint (http://localhost:11434 by default)
- Add project rules for style and architecture constraints
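The provider step above can be sketched as a Continue config entry. This assumes Continue's YAML config format and a `qwen2.5-coder:7b` model already pulled with `ollama pull`; the model name is an example, and the exact schema may differ by Continue version, so check its configuration reference.

```yaml
# Sketch of a Continue model entry pointing at local Ollama.
# Model name is an example — substitute whatever you pulled.
models:
  - name: Qwen2.5 Coder (local)
    provider: ollama
    model: qwen2.5-coder:7b
```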
Daily workflow
- Ask for an implementation plan before requesting any code
- Apply one patch at a time
- Run lint and tests locally after each patch
- Request a focused code-review summary of the final diff
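The apply-then-verify loop above can be sketched in a small script. The check commands here are placeholders, not tooling this stack requires; substitute your project's real lint and test invocations.

```python
import subprocess

# Placeholder checks; swap in your project's real commands,
# e.g. ["ruff", "check", "."] and ["pytest", "-q"].
CHECKS = [["python", "-c", "pass"]]

def checks_pass() -> bool:
    """Run each check; a patch is kept only if every command exits 0."""
    return all(subprocess.run(cmd).returncode == 0 for cmd in CHECKS)

print("keep patch" if checks_pass() else "revert and retry")
```

Running the checks after every applied patch keeps failures attributable to a single change.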
Quality controls
- Include repo conventions (naming, error handling, test layout) in the rules
- Enforce explicit lint and test commands so the model never has to guess them
- Keep a short retry strategy for low-confidence responses
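One way to implement the retry strategy is a thin wrapper around the model call. The `ask` callable and its `(answer, confidence)` return shape are hypothetical — neither Continue nor Ollama reports a confidence score directly — so treat this as a sketch of the control flow only.

```python
from typing import Callable, Optional

def ask_with_retry(
    ask: Callable[[str], tuple[str, float]],
    prompt: str,
    min_confidence: float = 0.7,
    max_attempts: int = 3,
) -> Optional[str]:
    """Retry a model call until its confidence clears a threshold.

    `ask` is a hypothetical wrapper around your local model that
    returns (answer, confidence). Returns None if no attempt clears
    the bar, signalling a fall-back to manual edits.
    """
    for _ in range(max_attempts):
        answer, confidence = ask(prompt)
        if confidence >= min_confidence:
            return answer
        # Add context on each retry so the model has more to work with.
        prompt += "\nBe specific: name the exact file and function to change."
    return None

# Demo with a stub model whose confidence improves on the second call.
scores = iter([0.4, 0.9])
print(ask_with_retry(lambda p: ("patch", next(scores)), "fix the bug"))
```

Keeping the retry budget small (two or three attempts) matches the "short retry strategy" above: if the model is still unsure, edit by hand instead.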
Good fits
- Solo developer workflows
- Privacy-sensitive repositories
- Offline-capable coding environments