zou-group/textgrad
TextGrad: Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients. Published in Nature.
Provides a PyTorch-like autograd API for optimizing text through LLM-generated feedback, using textual gradients instead of numerical ones. Supports multiple LLM backends via LiteLLM integration (GPT-4o, Bedrock, Gemini, Together) with optional caching, enabling optimization of diverse variables including prompts, code, reasoning chains, and molecular structures. Built around three core components—Variables, TextLoss functions, and TGD optimizer—that mirror PyTorch's computational graph paradigm for natural language optimization tasks.
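The three components map onto a familiar train-loop shape. A minimal sketch, assuming the `textgrad` package is installed and an OpenAI key is available in the environment (the example answer and prompts are illustrative, not from the library):

```python
# Minimal TextGrad loop: Variable -> TextLoss -> TGD.step()
# Requires an LLM backend (e.g. OPENAI_API_KEY set in the environment).
import textgrad as tg

# Engine used to generate textual gradients during backward().
tg.set_backward_engine("gpt-4o", override=True)

# The text being optimized; requires_grad=True places it in the graph.
answer = tg.Variable(
    "The sun rises in the west.",
    role_description="a factual one-sentence answer",
    requires_grad=True,
)

# The "loss" is itself natural language: an LLM critique prompt.
loss_fn = tg.TextLoss("Evaluate this answer for factual accuracy "
                      "and point out any errors.")

# Textual Gradient Descent over the variable, mirroring torch.optim.
optimizer = tg.TGD(parameters=[answer])

loss = loss_fn(answer)  # forward: the LLM critiques the answer
loss.backward()         # backward: the critique becomes a textual gradient
optimizer.step()        # update: the LLM rewrites the answer using it
print(answer.value)
```

Each call in the loop triggers an LLM request, so results are non-deterministic; the structure, not the output, is the point of the analogy to PyTorch.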
3,416 stars and 22,439 monthly downloads. Used by 1 other package. No commits in the last 6 months. Available on PyPI.
Stars
3,416
Forks
281
Language
Python
License
MIT
Last pushed
Jul 25, 2025
Monthly downloads
22,439
Commits (30d)
0
Dependencies
12
Reverse dependents
1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/zou-group/textgrad"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
ArikReuter/TopicGPT
TopicGPT integrates the benefits of LLMs into topic modelling
cattolatte/zenith-nlp-framework
A comprehensive toolkit to build, train, and deploy modern NLP models from the ground up....
emory-courses/nlp-essentials
Foundations of Modern NLP
WangRongsheng/IvyGPT
[CICAI 2023] The official codes for "Ivygpt: Interactive chinese pathway language model in...
FMXExpress/Song-Writer-AI
Write songs with lyrics using AI via LLMs like GPT-3.5-Turbo and Vicuna-13b.