rag-chat and rag-web-ui

These are complements: the SDK (rag-chat) provides the RAG backend logic and API, while the web UI (rag-web-ui) consumes that backend to deliver the conversational interface to end users.

rag-chat
Score: 63 (Established)
Maintenance 6/25 · Adoption 10/25 · Maturity 25/25 · Community 22/25
Stars: 258 · Forks: 57 · Commits (30d): 0 · Language: TypeScript · License: MIT
No risk flags

rag-web-ui
Score: 52 (Established)
Maintenance 6/25 · Adoption 10/25 · Maturity 16/25 · Community 20/25
Stars: 2,818 · Forks: 293 · Commits (30d): 0 · Language: TypeScript · License: Apache-2.0
No package published · No dependents

About rag-chat

upstash/rag-chat

Prototype SDK for RAG development.

Provides out-of-the-box ingestion for websites, PDFs, and other content sources, with built-in vector storage and optional Redis-backed chat history. Supports streaming in Next.js and integrates with multiple LLM providers (OpenAI, Anthropic, Groq, Mistral, Ollama) plus observability platforms such as Helicone and LangSmith. Optional features include rate limiting and a `disableRag` mode for LLM-only chat applications.
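The ingest-then-retrieve flow that an SDK like rag-chat automates can be illustrated with a self-contained sketch. Everything below is a hypothetical stand-in, not the rag-chat API: the bag-of-words `embed` function and the `InMemoryVectorStore` class are toy assumptions in place of a real embedding model and vector database.

```typescript
type Doc = { id: string; text: string };

// Toy embedding: a term-frequency vector over a fixed vocabulary.
// A real SDK would call an embedding model here.
function embed(text: string, vocab: string[]): number[] {
  const tokens = text.toLowerCase().split(/\W+/).filter(Boolean);
  return vocab.map((term) => tokens.filter((t) => t === term).length);
}

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return na && nb ? dot / (Math.sqrt(na) * Math.sqrt(nb)) : 0;
}

// Hypothetical stand-in for a hosted vector store.
class InMemoryVectorStore {
  private entries: { doc: Doc; vector: number[] }[] = [];
  constructor(private vocab: string[]) {}

  // Ingest: embed the document and keep the vector alongside it.
  add(doc: Doc): void {
    this.entries.push({ doc, vector: embed(doc.text, this.vocab) });
  }

  // Retrieve: return the top-k documents by cosine similarity.
  query(text: string, k = 1): Doc[] {
    const q = embed(text, this.vocab);
    return [...this.entries]
      .sort((x, y) => cosine(q, y.vector) - cosine(q, x.vector))
      .slice(0, k)
      .map((e) => e.doc);
  }
}

const vocab = ["redis", "vector", "chat", "history", "pdf", "ingest"];
const store = new InMemoryVectorStore(vocab);
store.add({ id: "a", text: "Redis-backed chat history" });
store.add({ id: "b", text: "Ingest PDF content into the vector store" });

const hits = store.query("how do I ingest a pdf?", 1);
// hits[0] is the PDF-ingestion document, the closest match to the query
```

The retrieved documents would then be prepended to the LLM prompt; the SDK's `disableRag` mode described above effectively skips this retrieval step.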

About rag-web-ui

rag-web-ui/rag-web-ui

RAG Web UI is an intelligent dialogue system based on RAG (Retrieval-Augmented Generation) technology.

Supports multiple document formats (PDF, DOCX, Markdown, plain text) with asynchronous processing and automatic chunking, and offers flexible LLM integration via OpenAI, DeepSeek, or a local Ollama deployment. Built on a Python FastAPI backend with ChromaDB or Qdrant vector databases, MinIO distributed storage, and the LangChain framework; it exposes OpenAPI interfaces for programmatic knowledge-base access, and its frontend-backend separated architecture enables multi-turn contextual dialogue with citation tracking.
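The "automatic chunking" step mentioned above can be sketched as a fixed-size splitter with overlap, so each piece fits an embedding model's input window while overlapping text preserves context across chunk boundaries. The `chunk` function and its sizes are illustrative assumptions, not rag-web-ui's actual implementation (which splits via LangChain on the Python side).

```typescript
// Split text into fixed-size chunks where each chunk repeats the last
// `overlap` characters of the previous one.
function chunk(text: string, size: number, overlap: number): string[] {
  if (overlap >= size) throw new Error("overlap must be smaller than size");
  const chunks: string[] = [];
  const step = size - overlap; // how far the window advances each iteration
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // final chunk reached the end
  }
  return chunks;
}

const doc = "abcdefghij"; // 10 characters, standing in for a parsed document
const pieces = chunk(doc, 4, 2); // 4-char chunks with a 2-char overlap
// → ["abcd", "cdef", "efgh", "ghij"]
```

Real deployments use chunk sizes on the order of hundreds of tokens; the overlap is what lets a retrieved chunk carry enough surrounding context to support the citation tracking described above.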

Scores updated daily from GitHub, PyPI, and npm data.