AI Coding Cookbooks
Practical workflows that show how to use modern coding agents, APIs, and local stacks for real engineering tasks.
Featured workflows
Advanced · 45-90 min
Claude Code workflow for large repository refactors
Use Claude Code in terminal-centric repos to run plan-first refactors with deterministic checkpoints.
Intermediate · 35-60 min
Cursor Cloud Agent workflow for multi-file refactors
A practical workflow for using Cursor Cloud Agents to plan, execute, and validate large refactors across multiple files.
Beginner · 25-40 min
Local coding stack with Continue and Ollama
Set up a privacy-first local AI coding workflow with Continue, Ollama, and project-level rules.
All cookbooks
Updated Feb 25, 2026
Claude Code workflow for large repository refactors
Goal Refactor high-impact code safely without losing observability. Recommended execution pattern 1) Start from a scoped spec Write a short spec in CLAUDE.md with: objective non-goals files in scope pass/fail te...
Tools: claude-code, aider
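The scoped spec this card starts from could look roughly like the fragment below. Every section name, file path, and test command here is an illustrative assumption, not taken from the cookbook:

```markdown
# Refactor spec (illustrative sketch for CLAUDE.md)

## Objective
Extract duplicated retry logic into one shared helper.

## Non-goals
- No public API changes
- No dependency upgrades

## Files in scope
- src/http_client.py
- tests/test_http_client.py

## Pass/fail tests
- `pytest tests/test_http_client.py` must pass before and after
```

Keeping the spec this short makes it cheap to re-read at each checkpoint and easy to diff when the scope changes.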
Updated Feb 25, 2026
OpenAI Codex API agent loop for implementation tasks
Objective Use API calls to run a structured coding loop with explicit validation gates. Loop design Stage A: Plan Prompt for: impacted modules implementation sequence test commands expected failure modes Sta...
Tools: openai-codex, openai-api
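The staged loop this card describes can be sketched as structure only. Here `call_model` and `validate` are stand-ins (assumptions) for real OpenAI API calls and test runs, so the gate logic is visible without any network dependency:

```python
# Structure-only sketch of a plan -> implement -> validate agent loop.
# call_model: stand-in for an OpenAI API call (assumption, not a real client).
# validate:   stand-in for running the project's test commands.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class LoopResult:
    plan: str
    patches: List[str] = field(default_factory=list)
    passed: bool = False


def run_loop(task: str,
             call_model: Callable[[str], str],
             validate: Callable[[str], bool],
             max_iters: int = 3) -> LoopResult:
    # Stage A: ask for a plan (impacted modules, sequence, test commands).
    plan = call_model(f"Plan the change: {task}")
    result = LoopResult(plan=plan)
    # Later stages: implement in bounded steps behind an explicit validation gate.
    for _ in range(max_iters):
        patch = call_model(f"Implement the next step of: {plan}")
        result.patches.append(patch)
        if validate(patch):      # gate: only a validated patch ends the loop
            result.passed = True
            break
    return result
```

Swapping the stubs for real API calls and a real test runner keeps the same shape: the loop never reports success unless the validation gate does.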
Updated Feb 25, 2026
Cursor Cloud Agent workflow for multi-file refactors
Outcome Ship a safe refactor in one branch with: an explicit plan bounded change scope runnable validation steps review-ready pull request notes Prerequisites Cursor installed and authenticated Repository check...
Tools: cursor, github-copilot
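The "bounded change scope" part of this workflow can be enforced mechanically. A minimal sketch, assuming the changed-file list is fed in from version control (in a real pipeline it would come from `git diff --name-only` against the base branch):

```python
from typing import Iterable, Sequence


def in_scope(changed_files: Iterable[str],
             allowed_prefixes: Sequence[str]) -> bool:
    """Return True only if every changed path sits under an allowed prefix.

    Hypothetical scope gate: an agent-made branch fails this check
    automatically if it touched files outside the agreed refactor scope.
    """
    prefixes = tuple(allowed_prefixes)
    return all(path.startswith(prefixes) for path in changed_files)
```

Running this as a pre-review step turns "bounded change scope" from a convention into a pass/fail signal on the branch.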
Updated Feb 25, 2026
Gemini API tool-grounding workflow for coding assistants
Use case Build a coding assistant that answers with grounded, source-linked recommendations. Architecture 1. Query intake 2. Context retrieval (repo + docs) 3. Tool execution (search, file reads, tests) 4. Structured ...
Tools: gemini-api, google-gemini-code-assist
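The four-stage architecture in this card can be sketched as a pipeline with pluggable stages. `retrieve`, `execute_tools`, and `generate` are stand-ins (assumptions), not Gemini API calls; the point is that every answer carries the sources it was grounded on:

```python
from typing import Callable, Dict, List


def answer(query: str,
           retrieve: Callable[[str], List[Dict]],
           execute_tools: Callable[[str, List[Dict]], List[str]],
           generate: Callable[[str, List[Dict], List[str]], str]) -> Dict:
    """Grounded-answer pipeline: intake -> retrieval -> tools -> structured output."""
    context = retrieve(query)                     # 2. repo + docs snippets, each with a source
    tool_results = execute_tools(query, context)  # 3. e.g. search, file reads, test runs
    return {                                      # 4. structured, source-linked answer
        "answer": generate(query, context, tool_results),
        "sources": sorted({c["source"] for c in context}),
    }
```

Because the stages are plain callables, each one can be swapped for a real model or tool client without changing the grounding contract.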
Updated Feb 25, 2026
Local coding stack with Continue and Ollama
Why this stack local execution controllable model selection low latency for everyday edits Setup outline 1. Install Ollama and pull coding model(s) 2. Install Continue extension in VS Code or JetBrains 3. Configur...
Tools: continue
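Step 3's Continue configuration could look roughly like the `config.json` fragment below. The model names are assumptions, and the schema varies by Continue version (newer releases use a YAML config), so treat this as a shape, not a copy-paste answer:

```json
{
  "models": [
    {
      "title": "Local coder",
      "provider": "ollama",
      "model": "qwen2.5-coder:7b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Local autocomplete",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```

Pointing both chat and autocomplete at Ollama keeps every request on the local machine, which is the privacy-first property this stack is built around.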