im-pramesh10/LocalPrompt
"A simple, lightweight client-server program for interfacing with local LLMs via Ollama, and with LLMs hosted on Groq via the Groq API."
No commits in the last 6 months.
Stars: 1
Forks: —
Language: JavaScript
License: MIT
Category: —
Last pushed: Aug 08, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/im-pramesh10/LocalPrompt"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
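The curl command above can also be called from code. Below is a minimal JavaScript sketch (Node 18+ or a browser, where the global `fetch` is available); the URL path comes from the curl example, but the shape of the JSON response is an assumption, as it is not documented here.

```javascript
// Base endpoint taken from the curl example above.
const BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers";

// Build the per-repository endpoint URL from the owner and repo names.
function qualityUrl(owner, repo) {
  return `${BASE}/${owner}/${repo}`;
}

// Fetch the quality record for a repository.
// No API key is required within the free 100 requests/day tier.
async function fetchQuality(owner, repo) {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  // Assumed: the endpoint returns JSON; inspect the result to see its fields.
  return res.json();
}

// Example usage:
// fetchQuality("im-pramesh10", "LocalPrompt").then(console.log);
```

If the service issues API keys, they are presumably passed as a header or query parameter; check the provider's documentation before relying on any particular mechanism.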
Higher-rated alternatives
mishushakov/llm-scraper
Turn any webpage into structured data using LLMs
Mobile-Artificial-Intelligence/maid
Maid is a free and open source application for interfacing with llama.cpp models locally, and...
run-llama/LlamaIndexTS
Data framework for your LLM applications, focused on server-side solutions
nuance1979/llama-server
LLaMA Server combines the power of LLaMA C++ with the beauty of Chatbot UI.
JHubi1/ollama-app
A modern and easy-to-use client for Ollama