shreyaskarnik/DistiLlama
Chrome Extension to Summarize or Chat with Web Pages/Local Documents Using locally running LLMs. Keep all of your data and conversations private. 🔐
Supports document and webpage chat via retrieval-augmented generation (RAG) using client-side embeddings from Transformers.js and vector storage with Voy, eliminating server dependencies. Integrates with Ollama for local LLM inference and uses Mozilla's Readability library to extract clean text before processing, ensuring high-quality summaries and context-aware responses without external API calls.
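The retrieval step described above can be sketched roughly: page chunks are embedded client-side, and the chunks nearest to the question are retrieved as context for the local model. This minimal TypeScript sketch uses a hand-rolled cosine similarity in place of Transformers.js embeddings and Voy's index; all names here are illustrative, not the extension's actual code.

```typescript
// Toy retrieval step of a client-side RAG pipeline: rank stored
// chunk embeddings by cosine similarity to a query embedding.
// In DistiLlama the embeddings come from Transformers.js and the
// index is Voy; this sketch stands in for both with plain arrays.

type Chunk = { text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}
```

The retrieved chunk texts would then be prepended to the user's question and sent to the locally running model through Ollama, keeping the whole pipeline on-device.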
303 stars. No commits in the last 6 months.
Stars: 303
Forks: 33
Language: TypeScript
License: MIT
Category:
Last pushed: Sep 02, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/shreyaskarnik/DistiLlama"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
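The same endpoint can be called from code. A minimal TypeScript sketch follows; only the URL pattern is taken from the curl example above, and the error handling is an assumption (the API's actual response schema and error codes are not documented here).

```typescript
// Build the quality-API URL for a given GitHub repo.
// Only the URL pattern comes from the curl example above.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag";

function qualityUrl(owner: string, repo: string): string {
  return `${API_BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

// Fetch the quality data as parsed JSON; the response shape is
// not specified here, so it is typed as unknown.
async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API returned ${res.status}`);
  return res.json();
}
```

For example, `qualityUrl("shreyaskarnik", "DistiLlama")` produces the same URL as the curl command above.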
Higher-rated alternatives
undreamai/LLMUnity
Create characters in Unity with LLMs!
Mintplex-Labs/anythingllm-docs
Documentation of AnythingLLM by Mintplex Labs Inc.
bloodworks-io/phlox
Open source, local first AI medical scribe for desktop and web.
mamei16/LLM_Web_search
An extension for oobabooga/text-generation-webui that enables the LLM to search the web
snexus/llm-search
Querying local documents, powered by LLM