ContextualAI/gritlm

Generative Representational Instruction Tuning

Score: 56 / 100 (Established)

Unifies text embedding and generation in a single model through instruction-based task routing, eliminating the need for separate retrieval and generation models. Built on instruction-tuned language models with pooling mechanisms for dense representations, it supports both zero-shot and instruction-conditioned inference across retrieval-augmented generation, semantic search, and language generation tasks. Available in multiple scales (7B to 8x7B), the framework integrates with Hugging Face's ecosystem and provides caching optimizations for efficient batch processing.
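The dense representations described above are what power the semantic-search use case: documents and queries are embedded into vectors, and retrieval ranks documents by cosine similarity. Below is a minimal, runnable sketch of that ranking step; the toy vectors stand in for real GritLM embeddings (a real pipeline would obtain them from the model's encode path with a task instruction), so the names `docs` and `query` are illustrative assumptions, not part of the library's API.

```python
import math

# Hypothetical stand-in embeddings: a real pipeline would embed the
# query and documents with GritLM (conditioned on a task instruction);
# fixed toy vectors keep this sketch runnable without the model.
docs = {
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.1, 0.8, 0.1],
    "doc_c": [0.0, 0.2, 0.9],
}
query = [1.0, 0.0, 0.1]

def cosine(a, b):
    # Cosine similarity: dot product over the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Rank documents by similarity to the query embedding, best first.
ranked = sorted(docs, key=lambda k: cosine(query, docs[k]), reverse=True)
print(ranked)  # → ['doc_a', 'doc_b', 'doc_c']
```

With real embeddings the only change is where the vectors come from; the ranking logic is identical.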

688 stars and 12,353 monthly downloads. Used by 1 other package. No commits in the last 6 months. Available on PyPI.

Stale: 6 months
Maintenance: 2 / 25
Adoption: 20 / 25
Maturity: 18 / 25
Community: 16 / 25


Stars: 688
Forks: 49
Language: Jupyter Notebook
License: MIT
Last pushed: Jun 25, 2025
Monthly downloads: 12,353
Commits (30d): 0
Dependencies: 5
Reverse dependents: 1

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/ContextualAI/gritlm"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
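For scripted access, the same endpoint can be called from Python. The sketch below only builds the URL following the shape of the curl example above (the `quality_url` helper and `BASE` constant are illustrative names, not part of the service); fetching is left to the caller, e.g. `urllib.request.urlopen`.

```python
# Base of the quality-data endpoint, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/embeddings"

def quality_url(owner: str, repo: str) -> str:
    # Compose the per-repository endpoint: BASE/<owner>/<repo>.
    return f"{BASE}/{owner}/{repo}"

print(quality_url("ContextualAI", "gritlm"))
# → https://pt-edge.onrender.com/api/v1/quality/embeddings/ContextualAI/gritlm
```

Keyless requests are limited to 100/day, so a script polling many repositories should register for a free key.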