jakobhoeg/nextjs-ollama-llm-ui
Fully-featured web interface for Ollama LLMs
Built on Next.js with local browser storage instead of a database, it communicates directly with Ollama's API and supports model management (download/delete), vision capabilities, voice input, and code syntax highlighting with one-click copying. The interface uses shadcn-ui components and Framer Motion animations for a ChatGPT-like experience across desktop and mobile without requiring external services.
1,415 stars. No commits in the last 6 months.
Stars: 1,415
Forks: 333
Language: TypeScript
License: MIT
Category:
Last pushed: Jun 05, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/jakobhoeg/nextjs-ollama-llm-ui"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
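The same endpoint can be called programmatically. A minimal TypeScript sketch, assuming the response is JSON (the response shape is not documented here, so it is typed as `unknown`; only the URL pattern comes from the curl example above):

```typescript
// Base path taken from the curl example above.
const BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

// Build the endpoint URL for an owner/repo pair.
function qualityUrl(owner: string, repo: string): string {
  return `${BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

// Fetch quality data for a repo; throws on non-2xx responses.
async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return res.json();
}

// Usage:
// fetchQuality("jakobhoeg", "nextjs-ollama-llm-ui").then(console.log);
```

Unauthenticated calls count against the 100 requests/day limit; the authentication mechanism for keyed requests is not shown here.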
Related tools
SillyTavern/SillyTavern
LLM Frontend for Power Users.
chyok/ollama-gui
A single-file tkinter-based Ollama GUI project with no external dependencies.
libre-webui/libre-webui
Privacy-first web interface for local AI models. Clean, minimal UI for Ollama with extensible...
matlab-deep-learning/llms-with-matlab
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions, Azure® OpenAI Services, and Ollama™
ollama4j/ollama4j
A simple Java library for interacting with Ollama server.