tyrell/llm-ollama-llamaindex-bootstrap-ui
This is a LlamaIndex project bootstrapped with create-llama, acting as a full-stack UI to accompany the Retrieval-Augmented Generation (RAG) Bootstrap Application.
It loads pre-built vector store indexes from a separate RAG pipeline and queries them with streaming enabled, keeping the indexing and query layers separate. The full-stack architecture pairs a Python backend (FastAPI/LlamaIndex) that handles index loading and retrieval with a Node.js/React frontend served on port 3000, allowing real-time conversational interaction with indexed documents. It is designed to work with locally hosted LLMs via Ollama while leveraging LlamaIndex's data connectors and query abstractions.
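The query layer described above can be sketched in a few lines of LlamaIndex. This is an illustrative sketch, not code from the repository: it assumes llama-index >= 0.10, an Ollama server on localhost, and that `STORAGE_DIR` points at the persist directory written by the separate indexing pipeline (the directory path and model name are assumptions, not values from the repo).

```python
# Sketch: load a pre-built vector store index from disk and stream a
# response through a locally hosted Ollama model.
STORAGE_DIR = "./storage"   # illustrative; wherever the RAG pipeline persisted its index
OLLAMA_MODEL = "llama2"     # illustrative; any model pulled into Ollama


def stream_query(question: str) -> str:
    # Imports are deferred so this module loads even without llama-index
    # installed; the function itself needs llama-index and a running Ollama.
    from llama_index.core import Settings, StorageContext, load_index_from_storage
    from llama_index.embeddings.ollama import OllamaEmbedding
    from llama_index.llms.ollama import Ollama

    # Point LlamaIndex at the local Ollama server for both generation
    # and query-time embeddings.
    Settings.llm = Ollama(model=OLLAMA_MODEL, request_timeout=120.0)
    Settings.embed_model = OllamaEmbedding(model_name=OLLAMA_MODEL)

    # Load the index that the separate indexing pipeline persisted to disk,
    # rather than rebuilding it here (indexing and query layers stay separate).
    storage_context = StorageContext.from_defaults(persist_dir=STORAGE_DIR)
    index = load_index_from_storage(storage_context)

    # streaming=True yields tokens as they are generated.
    query_engine = index.as_query_engine(streaming=True)
    response = query_engine.query(question)

    chunks = []
    for token in response.response_gen:
        chunks.append(token)  # in the real app these are streamed to the UI
    return "".join(chunks)
```

In the actual project the FastAPI backend wraps this kind of logic in a chat endpoint and streams tokens to the React frontend as they arrive, rather than joining them into a single string.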
No commits in the last 6 months.
Stars: 32
Forks: 14
Language: TypeScript
License: —
Category:
Last pushed: Feb 23, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/generative-ai/tyrell/llm-ollama-llamaindex-bootstrap-ui"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
apache/burr
Build applications that make decisions (chatbots, agents, simulations, etc...). Monitor, trace,...
eliasdabbas/chatnificent
Chatnificent: LLM chat app framework – Minimally complete. Maximally hackable.
intentional-ai/intentional
Intentional is an open-source framework to build reliable LLM chatbots that actually talk and...
surveychat/surveychat
Conversational AI tools for researchers
kunalsuri/kllama
✅🦙 Kllama: Your Local & Private Chatbot :dependabot: