Sequential stages where each phase’s output feeds the next, with context reset between stages. How it works:
  1. Create tasks with dependencies: each stage is blocked by the previous
  2. Each handoff resets context — research noise doesn’t leak into implementation
  3. Each stage writes structured output that the next stage consumes
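The steps above can be sketched in a few lines. This is a hypothetical illustration, not a real API: `run_pipeline` and the stage functions are made-up names, and stages are plain functions standing in for agent runs. The key property is that each stage receives only the previous stage's structured file, never its transcript.

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory

# Hypothetical sketch of a dependency-blocked pipeline. Each stage
# consumes ONLY the JSON the previous stage wrote, so research noise
# cannot leak into implementation.
def run_pipeline(stages, workdir):
    prev = None
    for name, stage_fn in stages:
        out = Path(workdir) / f"{name}.json"
        payload = json.loads(prev.read_text()) if prev else {}
        out.write_text(json.dumps(stage_fn(payload)))  # fresh context each stage
        prev = out
    return json.loads(prev.read_text())

with TemporaryDirectory() as d:
    result = run_pipeline([
        ("research",  lambda _: {"findings": ["api X is deprecated"]}),
        ("plan",      lambda r: {"steps": [f"replace {f}" for f in r["findings"]]}),
        ("implement", lambda p: {"done": p["steps"]}),
    ], d)
print(result)  # {'done': ['replace api X is deprecated']}
```

Because each handoff goes through a file on disk, any stage can be re-run in isolation by pointing it at the previous stage's output.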
Best for: research → implementation flows, multi-stage migrations, any workflow where earlier context would bias later steps. A typical four-stage pipeline:
Research (Haiku) → Plan (Opus) → Implement (Sonnet) → Validate (Sonnet, read-only)
Each stage uses the cheapest model that can do the job — see cost and model routing. Division of labor:
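The stage-to-model assignment can be expressed as a small routing table. A minimal sketch, assuming nothing about the real configuration format: the model names are shorthand placeholders from the pipeline diagram above, not exact API identifiers.

```python
# Hypothetical routing table: the cheapest model that can do each job.
# Model names are illustrative shorthand, not real API model IDs.
STAGE_MODELS = {
    "research":  {"model": "haiku",  "read_only": False},
    "plan":      {"model": "opus",   "read_only": False},
    "implement": {"model": "sonnet", "read_only": False},
    "validate":  {"model": "sonnet", "read_only": True},  # validator may not edit files
}

def model_for(stage):
    return STAGE_MODELS[stage]["model"]

print(model_for("plan"))  # opus
```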
| Stage | Orchestrator | Agents |
| --- | --- | --- |
| Research | Coordinates topics | 3-4 parallel researchers |
| Planning | Manages revision loop | Planner + checker until pass |
| Execution | Groups into waves, tracks | Executors in parallel, fresh context each |
| Verification | Routes next step | Verifier checks code against goals |
The orchestrator never does heavy lifting — it spawns, waits, integrates, routes.
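That spawn/wait/integrate/route loop can be sketched with ordinary thread pooling. A hypothetical illustration: `orchestrate` and `research_fn` are invented names, and a lambda stands in for a spawned researcher agent. The point is that the orchestrator only fans out work and merges results, doing no research itself.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical orchestrator loop: spawn parallel workers, wait for all
# of them, integrate their outputs, and return the merged result to be
# routed to the next stage. No heavy lifting happens here.
def orchestrate(topics, research_fn):
    with ThreadPoolExecutor(max_workers=4) as pool:    # 3-4 parallel researchers
        results = list(pool.map(research_fn, topics))  # spawn + wait
    return {"topics": dict(zip(topics, results))}      # integrate for handoff

report = orchestrate(["auth", "billing"], lambda t: f"notes on {t}")
print(report)  # {'topics': {'auth': 'notes on auth', 'billing': 'notes on billing'}}
```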

## Examples in the wild

| Example | What it shows |
| --- | --- |
| cyber-defense-team | 4-stage pipeline: log-ingestor → anomaly-detector → risk-classifier → threat-reporter. Each agent reads the previous stage’s JSON and writes new JSON. Includes cost estimates per stage. |
| gsd:quick | Composable pipeline: `--discuss`, `--research`, `--full` each add a stage. Omit what you don’t need. |
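A composable pipeline in the spirit of gsd:quick can be modeled as flags that add stages to a base. This is a loose sketch of the idea only: `build_pipeline`, the flag-to-stage mapping, and the stage ordering are all assumptions, not gsd:quick's actual behavior.

```python
# Hypothetical sketch of flag-composable stages. The base pipeline
# always plans and implements; flags bolt on optional stages.
BASE = ["plan", "implement"]

def build_pipeline(discuss=False, research=False, full=False):
    stages = list(BASE)
    if research:
        stages.insert(0, "research")   # assumed: research runs first
    if discuss:
        stages.insert(0, "discuss")    # assumed: discussion precedes research
    if full:
        stages.append("validate")      # assumed: --full adds final validation
    return stages

print(build_pipeline(research=True, full=True))
# ['research', 'plan', 'implement', 'validate']
```

Omitting every flag yields the bare plan-and-implement pipeline, matching the "omit what you don’t need" principle.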