upstash/rag-chat
Prototype SDK for RAG development.
Provides out-of-the-box ingestion for websites, PDFs, and other content sources, with built-in vector storage and optional Redis-backed chat history. Supports streaming in Next.js and integrates with multiple LLM providers (OpenAI, Anthropic, Groq, Mistral, Ollama) plus observability platforms like Helicone and LangSmith. Optional features include rate limiting and a `disableRag` mode for LLM-only chat applications.
258 stars. Available on npm.
Stars: 258
Forks: 57
Language: TypeScript
License: MIT
Category:
Last pushed: Dec 15, 2025
Commits (30d): 0
Dependencies: 15
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/upstash/rag-chat"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
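The same endpoint can be called from code. A minimal TypeScript sketch, assuming the URL pattern generalizes as `/api/v1/quality/rag/{owner}/{repo}` (inferred from the curl example above) and that the endpoint returns JSON; the `endpointFor` and `fetchQuality` helper names are illustrative, not part of the API:

```typescript
// Base path inferred from the documented curl example; not an official constant.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag";

// Build the endpoint URL for a given repository.
function endpointFor(owner: string, repo: string): string {
  return `${API_BASE}/${owner}/${repo}`;
}

// Fetch repo quality data on the public (keyless) tier, 100 requests/day.
// Assumes a JSON response body; throws on a non-2xx status.
async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(endpointFor(owner, repo));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```

For example, `fetchQuality("upstash", "rag-chat")` targets the same URL as the curl command shown above.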
Related tools
merefield/discourse-chatbot
An AI bot with RAG capability for Topics and Chat in Discourse, currently powered by OpenAI.
vercel-labs/ai-sdk-preview-rag
Retrieval-augmented generation (RAG) template powered by the AI SDK.
ajac-zero/example-rag-app
Open-Source RAG app with LLM Observability (Langfuse), support for 100+ providers (LiteLLM),...
rag-web-ui/rag-web-ui
RAG Web UI is an intelligent dialogue system based on RAG (Retrieval-Augmented Generation) technology.
skaldlabs/skald
Context layer platform in your infrastructure