shreyaskarnik/DistiLlama

Chrome extension to summarize or chat with web pages and local documents using locally running LLMs. Keeps all of your data and conversations private. 🔐

Quality score: 42 / 100 (Emerging)

Supports document and webpage chat via retrieval-augmented generation (RAG) using client-side embeddings from Transformers.js and vector storage with Voy, eliminating server dependencies. Integrates with Ollama for local LLM inference and uses Mozilla's Readability library to extract clean text before processing, ensuring high-quality summaries and context-aware responses without external API calls.
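The pipeline described above — embed page chunks in the browser, store the vectors, retrieve the best matches for a question, and feed them to a local LLM — can be sketched as follows. This is a minimal, self-contained illustration: the real extension uses Transformers.js for embeddings and Voy for vector storage, but here a toy token-hashing embedder stands in so the retrieval flow is runnable without any WASM model. All function names are illustrative, not DistiLlama's actual API.

```typescript
const DIM = 64;

// Toy embedder: hash each token into a fixed-size normalized vector.
// A stand-in for a Transformers.js feature-extraction pipeline.
function embed(text: string): number[] {
  const v = new Array(DIM).fill(0);
  for (const tok of text.toLowerCase().split(/\W+/).filter(Boolean)) {
    let h = 0;
    for (let i = 0; i < tok.length; i++) h = (h * 31 + tok.charCodeAt(i)) >>> 0;
    v[h % DIM] += 1;
  }
  const norm = Math.hypot(...v) || 1;
  return v.map((x) => x / norm);
}

// Cosine similarity of two normalized vectors is just their dot product.
function cosine(a: number[], b: number[]): number {
  return a.reduce((s, x, i) => s + x * b[i], 0);
}

// Index page chunks, then retrieve the top-k chunks for a question.
// These chunks would be prepended to the prompt sent to Ollama.
function retrieve(chunks: string[], query: string, k = 2): string[] {
  const index = chunks.map((text) => ({ text, vec: embed(text) }));
  const q = embed(query);
  return index
    .map((e) => ({ text: e.text, score: cosine(q, e.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((e) => e.text);
}

// Usage: chunks extracted from a page (e.g. by Readability), then queried.
const chunks = [
  "Ollama runs large language models locally.",
  "Readability extracts the main article text from a page.",
  "Voy stores embedding vectors in the browser.",
];
console.log(retrieve(chunks, "Where are embedding vectors stored?", 1));
```

The key design point this mirrors is that every step — extraction, embedding, indexing, retrieval, generation — runs on the client, which is what removes the server and external-API dependencies.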

303 stars. No commits in the last 6 months.

Signals: Stale (6m) · No Package · No Dependents

Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 16 / 25


Stars: 303
Forks: 33
Language: TypeScript
License: MIT
Last pushed: Sep 02, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/rag/shreyaskarnik/DistiLlama"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.