zou-group/textgrad

TextGrad: Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients. Published in Nature.

Quality score: 67 / 100 (Established)

Provides a PyTorch-like autograd API for optimizing text through LLM-generated feedback, using textual gradients instead of numerical ones. Supports multiple LLM backends via LiteLLM integration (GPT-4o, Bedrock, Gemini, Together) with optional caching, enabling optimization of diverse variables including prompts, code, reasoning chains, and molecular structures. Built around three core components (Variables, TextLoss functions, and the TGD optimizer) that mirror PyTorch's computational-graph paradigm for natural-language optimization tasks.
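The three-component loop described above can be sketched in plain Python. This is a minimal, offline toy of the pattern, not textgrad's actual implementation: the `critic` and `editor` lambdas are hypothetical stand-ins for the LLM calls that textgrad would make, and all class names here are illustrative only.

```python
# Toy sketch of the Variable / TextLoss / TGD pattern.
# In real textgrad an LLM generates the feedback and rewrites; here
# hypothetical stub functions stand in so the example runs offline.

class Variable:
    """Holds a text value plus accumulated textual 'gradients' (feedback)."""
    def __init__(self, value, role_description=""):
        self.value = value
        self.role_description = role_description
        self.gradients = []

class TextLoss:
    """Evaluates a variable and backpropagates feedback as text."""
    def __init__(self, critic):
        self.critic = critic  # stand-in for an LLM judge

    def __call__(self, var):
        feedback = self.critic(var.value)
        var.gradients.append(feedback)  # the "backward" pass: store text, not numbers
        return feedback

class TGD:
    """Textual Gradient Descent: rewrites each parameter using its feedback."""
    def __init__(self, parameters, editor):
        self.parameters = parameters
        self.editor = editor  # stand-in for an LLM rewriter

    def step(self):
        for p in self.parameters:
            for g in p.gradients:
                p.value = self.editor(p.value, g)
            p.gradients.clear()

# Hypothetical "LLM" stubs; a real run would prompt a model instead.
critic = lambda text: "too vague" if "things" in text else "ok"
editor = lambda text, fb: text.replace("things", "steps") if fb == "too vague" else text

prompt = Variable("List the things to solve this.", role_description="system prompt")
loss = TextLoss(critic)
opt = TGD([prompt], editor)

loss(prompt)  # forward + backward: records textual feedback on the variable
opt.step()    # optimizer applies the feedback to rewrite the text
print(prompt.value)  # -> "List the steps to solve this."
```

The point of the sketch is the data flow: feedback text accumulates on the variable like a gradient, and the optimizer step consumes it to produce a revised value, mirroring `loss.backward()` followed by `optimizer.step()` in PyTorch.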

3,416 stars and 22,439 monthly downloads. Used by 1 other package. No commits in the last 6 months. Available on PyPI.

Stale (no commits in the last 6 months)

Maintenance: 2 / 25
Adoption: 21 / 25
Maturity: 25 / 25
Community: 19 / 25


Stars: 3,416
Forks: 281
Language: Python
License: MIT
Last pushed: Jul 25, 2025
Monthly downloads: 22,439
Commits (30d): 0
Dependencies: 12
Reverse dependents: 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/zou-group/textgrad"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.