ralph-orchestrator and wiggum-cli
wiggum-cli, the AI agent that plugs into any codebase, is a client for ralph-orchestrator, an improved implementation of the Ralph Wiggum technique for autonomous AI agent orchestration; the two are ecosystem siblings.
About ralph-orchestrator
mikeyobrien/ralph-orchestrator
An improved implementation of the Ralph Wiggum technique for autonomous AI agent orchestration
Implements a hat-based persona system with backpressure gates (tests, lint, typecheck) that coordinate through events, supporting multiple LLM backends (Claude, Gemini, Copilot CLI) and persistent memories. Runs as a Rust RPC API with web dashboard, MCP server over stdio, or CLI; includes human-in-the-loop via Telegram for agent questions and proactive guidance during orchestration loops.
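The backpressure idea can be sketched as follows: each gate is a shell command whose failure is fed back to the agent loop before the orchestrator advances. This is a minimal illustration, not ralph-orchestrator's actual implementation; the gate commands shown (cargo test, clippy, check) are placeholders for whatever a project configures.

```python
import subprocess

# Hypothetical gate set; real gates are configured per project.
GATES = {
    "tests": ["cargo", "test"],
    "lint": ["cargo", "clippy"],
    "typecheck": ["cargo", "check"],
}

def run_gates(gates):
    """Run each backpressure gate and return the names of the gates that failed.

    A non-empty result acts as pressure on the loop: the active persona must
    address the failures before the orchestrator moves to the next step.
    """
    failures = []
    for name, cmd in gates.items():
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode != 0:
            failures.append(name)
    return failures
```

In this sketch a gate is pass/fail only; surfacing the captured stdout/stderr back to the agent is what makes the feedback actionable.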
About wiggum-cli
federiconeri/wiggum-cli
AI agent that plugs into any codebase — scans your stack, generates specs through AI interviews, and runs Ralph loops.
Supports detection of 80+ technologies across frameworks, databases, ORMs, and testing tools, generating stack-specific prompts and implementation guides. Uses a multi-agent orchestration system—planning, parallel context enrichment, synthesis, and evaluation—to produce tailored specs and configurations stored in `.ralph/`. Hands off feature implementation to Claude Code or any CLI agent, orchestrating autonomous implement-test-fix loops with git worktree isolation and manual approval checkpoints.
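The implement-test-fix loop with worktree isolation can be sketched as below. This is an illustration of the pattern, not wiggum-cli's code: the helper names are hypothetical, and the agent and test steps are passed in as callables standing in for a CLI agent invocation and a test runner.

```python
import os
import subprocess
import tempfile

def create_isolated_worktree(repo_dir, branch):
    """Create a separate git worktree so the agent's changes never touch the
    main checkout until they are approved."""
    wt_dir = os.path.join(tempfile.mkdtemp(), branch)
    subprocess.run(
        ["git", "-C", repo_dir, "worktree", "add", "-b", branch, wt_dir],
        check=True, capture_output=True,
    )
    return wt_dir

def implement_test_fix_loop(worktree, run_agent, run_tests, max_rounds=5):
    """Alternate agent implementation with test runs until the suite is green
    or the round budget is exhausted; a True result means ready for manual
    approval, not automatic merge."""
    for _ in range(max_rounds):
        run_agent(worktree)      # hand off to Claude Code or any CLI agent
        if run_tests(worktree):  # stop only when the tests pass
            return True
    return False
```

The design point is the boundary: the loop is fully autonomous inside the worktree, but merging back is gated on a human checkpoint.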