wmahfoudh/crabai
Minimal multi-provider LLM CLI. A single static binary with no runtime dependencies: pipe stdin through any of 8 LLM providers (OpenAI, Anthropic, Google, Groq, and more) and read the response on stdout. Built for Unix pipelines and composable shell workflows.
Stars: —
Forks: —
Language: Rust
License: Apache-2.0
Category:
Last pushed: Feb 21, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/wmahfoudh/crabai"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
0xPlaygrounds/rig
⚙️🦀 Build modular and scalable LLM Applications in Rust
Abraxas-365/langchain-rust
🦜️🔗LangChain for Rust, the easiest way to write LLM-based programs in Rust
moly-ai/moly-ai
Moly AI: A local + cloud AI LLM multi-platform GUI app in pure Rust
ivangabriele/mistralai-client-rs
Rust client for Mistral AI API.
edgee-ai/rust-sdk
The official Rust library for the Edgee AI Gateway