Warp Goes Open Source (April 28, 2026): AGPL Client, OpenAI as Founding Sponsor, Oz-Driven Contributions
Warp open-sourced the Warp client under AGPL on April 28, 2026 at github.com/warpdotdev/warp, with OpenAI as the founding sponsor, expanded support for Kimi/MiniMax/Qwen plus an 'auto (open)' router, and a new Oz-agent-driven contribution workflow.
Editorial Team
The AI Coding Tools Directory editorial team researches and reviews AI-powered development tools to help developers find the best solutions for their workflows.
On April 28, 2026, Warp announced that the Warp client is now open source. The repo lives at github.com/warpdotdev/warp under an AGPL license. OpenAI is the founding sponsor, and Warp is rolling out a contribution model where the heavy lifting is done by Oz agents, with the community handling ideas, direction, and verification rather than submitting traditional pull requests. Warp also expanded support for open-weight models — Kimi, MiniMax, Qwen, plus an auto (open) router that picks an open model per task.
TL;DR
- April 28, 2026: Warp client is now open source at github.com/warpdotdev/warp under AGPL.
- OpenAI is the founding sponsor; GPT models power the Oz-driven contribution workflow.
- Contribution model: agents do the implementation; community provides ideas, direction, and verification — not the traditional PR loop.
- New open-weight model support: Kimi, MiniMax, Qwen, plus an auto (open) router for per-task open-model selection.
- What stays proprietary: the announcement focuses on the client; Oz cloud orchestration is referenced as a separate platform.
Quick Answer
Warp's bet here is twofold. (1) Open-source the client — the part that runs on your machine, the rendering, the terminal core, the agent UX — under AGPL, where derivative network services have to share back. (2) Reframe contribution itself: instead of "PRs welcome", contributors describe what should change and an Oz-style agent does the implementation, with maintainers and the community in the verification loop. If you're an agent-tooling builder, the AGPL client is now a reference implementation you can read end to end.
What Was Open-Sourced
"Warp's source code is now available at github.com/warpdotdev/warp with an AGPL license."
The artifact that is open is the Warp client — the GPU-accelerated terminal app you install on your machine. The AGPL choice is meaningful: AGPL extends copyleft to network use, so a hosted service built on the Warp client codebase would have to publish its source under the same terms. That's the same lever MongoDB, Grafana, and others have used to keep cloud providers from re-hosting an open core without giving back.
What is not the same artifact: Oz, Warp's cloud agent platform. The announcement references Oz as the engine driving the new contribution workflow, but Oz cloud orchestration is presented as a separate platform — not the code being open-sourced today. If you previously read our Warp Oz guide, the cloud side described there is unchanged in scope.
OpenAI as the Founding Sponsor
OpenAI is the founding sponsor of the open-source effort. The visible mechanism is that GPT models power the agent workflows Warp is using to land community contributions. There's no exclusivity claim in the announcement — Warp continues to support a multi-model lineup — but the partnership is what funds the agent-first contribution loop rather than a pure volunteer model.
For context on where OpenAI's coding-agent stack stands right now, see our OpenAI Codex April 2026 update.
The Oz-Driven Contribution Workflow
Warp's contribution model is the most unusual part of the announcement. From the post:
"We want agents doing the heavy lifting (coding, planning, testing, etc.) and community members helping with ideas, direction and verification."
Concretely, that means:
- Community contributors open GitHub issues describing intent, scope, and acceptance criteria.
- Oz agents do the planning, implementation, and testing.
- Warp maintainers and community reviewers verify, redirect, and accept work — collective knowledge management rather than a traditional PR queue.
If this works as described, Warp will be one of the first mainstream open-source projects to make "agent-implemented, human-verified" the default path rather than a side experiment. It also pairs naturally with the AGPL choice: the more the implementation is agent-produced and shared, the higher the leverage of keeping it under copyleft.
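To make the loop concrete, here is a minimal sketch of the contribution flow the announcement describes — community intent in, agent implementation, human verdict out. This is a hypothetical model, not Warp's actual Oz code; the class names, fields, and the `agent_implement`/`human_verify` functions are all illustrative.

```python
from dataclasses import dataclass

@dataclass
class Issue:
    """A community-filed issue: intent and acceptance criteria, no code."""
    intent: str
    acceptance_criteria: list[str]

@dataclass
class AgentRun:
    """What the agent produces: a plan, a patch, and a test result."""
    plan: str
    patch: str
    tests_passed: bool

def agent_implement(issue: Issue) -> AgentRun:
    # Stand-in for the Oz agent: plan, implement, and test against
    # the stated criteria. A real agent would emit an actual patch.
    plan = f"Plan for: {issue.intent}"
    patch = "diff --git ..."  # placeholder patch
    return AgentRun(plan=plan, patch=patch, tests_passed=True)

def human_verify(issue: Issue, run: AgentRun) -> str:
    # Maintainers and community reviewers check the agent's output
    # against the criteria, then accept or redirect with new direction.
    if run.tests_passed:
        return "accept"
    return "redirect"

issue = Issue(
    intent="Add a --json flag to the blocks command",
    acceptance_criteria=["emits valid JSON", "default output unchanged"],
)
run = agent_implement(issue)
print(human_verify(issue, run))  # -> accept
```

The key inversion: humans author the `Issue` and the verdict, never the `patch` — exactly the "ideas, direction and verification" split quoted above.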
Wider Open-Weight Model Support
In the same release, Warp expanded its model lineup with three open-weight options and a new auto-router:
| Model | Notes |
|---|---|
| Kimi | Newly supported in Warp |
| MiniMax | Newly supported in Warp |
| Qwen | Newly supported in Warp |
| auto (open) | Router that selects an optimal open model per task |
The auto (open) mode mirrors the per-task routing pattern used by Cursor's auto mode and Claude Code's auto mode for Max users, but constrained to open-weight models. For developers in regulated environments where data residency or model provenance matters, that's a meaningful lever.
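A per-task open-model router of this kind can be sketched in a few lines. The routing table and task categories below are purely illustrative assumptions — Warp has not published how auto (open) actually classifies tasks or which model it prefers per category.

```python
# Hypothetical routing table: task kind -> open-weight model.
# Categories and choices are illustrative, not Warp's actual logic.
OPEN_MODELS = {
    "long_context": "kimi",   # e.g. summarizing a large build log
    "code_edit": "qwen",      # e.g. applying a refactor
    "fast_chat": "minimax",   # e.g. short Q&A in the terminal
}

def auto_open(task_kind: str) -> str:
    """Pick an open-weight model for the task, with a safe default."""
    return OPEN_MODELS.get(task_kind, "qwen")

print(auto_open("long_context"))  # -> kimi
print(auto_open("unknown"))       # -> qwen (fallback)
```

The constraint that makes this mode interesting is the closed-world routing set: every branch resolves to an open-weight model, which is what makes provenance auditable in regulated environments.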
Why This Matters for AI Coding Tools
A few signals worth pulling out:
- The AGPL terminal client is now a reference implementation. Anyone building an agentic terminal — or evaluating one for a regulated stack — has a real codebase to read instead of inferring behavior from screenshots.
- Agent-first contribution lowers the barrier from "I can write Rust + know your codebase" to "I can describe what I want and verify what came back." That widens the contributor pool, but only if the verification loop is rigorous enough to keep quality high.
- The OpenAI sponsorship is a notable cross-vendor move: OpenAI underwriting an open-source agentic terminal that supports many other model vendors, including direct competitors.
- AGPL is the right hammer for the cloud-fork problem, but it also raises the bar for enterprises that want to embed Warp's client code in their own paid tools — something to check with your license team before forking.
How This Compares
| Capability | Warp (Apr 28, 2026) | Claude Code | Cursor |
|---|---|---|---|
| Client source available | AGPL on GitHub | Closed | Closed |
| Open-weight model support | Kimi, MiniMax, Qwen, auto (open) | Limited (via API providers) | Limited |
| Contribution model | Agent-implemented, human-verified | Closed | Closed |
| Founding model sponsor | OpenAI (GPT) | n/a | n/a |
Sources
- Warp blog — Warp Is Now Open Source (April 28, 2026): warp.dev/blog/warp-is-now-open-source
- Warp client source: github.com/warpdotdev/warp
For broader context on agentic terminal tools see our Warp Oz guide, AI CLI coding tools roundup, and best open-source AI coding tools.