rishi-raj-jain/sse-streaming-llm-response
Using Server-Sent Events (SSE) to stream LLM responses in Next.js
Score: 21 / 100
Status: Experimental · Stale (no commits in the last 6 months) · No License · No Package · No Dependents
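The repository's topic, streaming LLM output over Server-Sent Events from a Next.js route handler, can be sketched as follows. This is a minimal illustration of the SSE wire format and a streamed `Response`, not code from the repo; the `sseFrame` and `streamTokens` helpers and the hard-coded token list are assumptions for the example.

```typescript
// Each SSE message is one or more "data:" lines terminated by a blank line.
function sseFrame(data: string): string {
  return `data: ${data}\n\n`;
}

// A Next.js App Router route handler would return a streamed Response like
// this, enqueueing one SSE frame per LLM token as it arrives. Here the
// tokens are passed in as a plain array to keep the sketch self-contained.
function streamTokens(tokens: string[]): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      for (const t of tokens) {
        controller.enqueue(encoder.encode(sseFrame(t)));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    },
  });
}
```

On the client, an `EventSource` (or a `fetch` reader) consumes these frames one `data:` event at a time, which is what lets the UI render tokens as they are generated instead of waiting for the full completion.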
Score breakdown:
- Maintenance: 0 / 25
- Adoption: 5 / 25
- Maturity: 1 / 25
- Community: 15 / 25
Repository stats:
- Stars: 10
- Forks: 5
- Language: TypeScript
- License: —
- Category: —
- Last pushed: May 06, 2024
- Commits (last 30 days): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/rishi-raj-jain/sse-streaming-llm-response"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
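The same endpoint can be called programmatically. The sketch below only builds the URL shown in the curl command and fetches it; since the response schema is not documented here, the JSON is returned as `unknown` rather than parsed into named fields, and the helper names are illustrative.

```typescript
// Base path taken from the curl example above.
const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/rag";

// Build the per-repository quality URL.
function qualityUrl(owner: string, repo: string): string {
  return `${API_BASE}/${owner}/${repo}`;
}

// Fetch the quality report; the response shape is not documented here,
// so callers receive the raw parsed JSON.
async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API returned ${res.status}`);
  return res.json();
}
```

Usage: `fetchQuality("rishi-raj-jain", "sse-streaming-llm-response").then(console.log)`.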
Higher-rated alternatives
- Azure-Samples/serverless-chat-langchainjs (score 67): Build your own serverless AI Chat with Retrieval-Augmented-Generation using LangChain.js,...
- GitHamza0206/simba (score 54): OpenSource Production ready Customer service with built in Evals and monitoring
- Cocolalilal/LastChat (score 54): A Fork of Rikkahub with an overhauled UI and feature additions
- crawlchat/crawlchat (score 53): Turn your documentation into an AI assistant that answers questions instantly
- Dcup-dev/dcup (score 52): Dcup - Advanced RAG for Personal Knowledge ☕