ollama4j/ollama4j-web-ui
Web UI for Ollama built in Java with Vaadin, Spring Boot and Ollama4j
Provides chat and image-based inference interfaces that communicate with local Ollama servers through the Ollama4j client library. Deployable via Docker, Docker Compose, or standalone JAR with configurable model management and multi-model support through a Vaadin-based component architecture. Supports streaming responses, file uploads up to 50MB, and configurable request timeouts for long-running inference tasks.
123 stars. No commits in the last 6 months.
Stars: 123
Forks: 31
Language: Java
License: MIT
Category:
Last pushed: Mar 19, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ollama4j/ollama4j-web-ui"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
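The curl call above can also be issued from Java (the featured repo's own language) with the JDK's built-in HttpClient. A minimal sketch: the response schema is not documented here, so the commented-out section simply prints the raw body rather than parsing assumed fields.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class QualityApiExample {
    // Base path taken from the curl example above.
    static final String BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

    // Build the GET request for a given owner/repo pair.
    static HttpRequest buildRequest(String owner, String repo) {
        return HttpRequest.newBuilder()
                .uri(URI.create(BASE + "/" + owner + "/" + repo))
                .GET()
                .build();
    }

    public static void main(String[] args) throws Exception {
        HttpRequest request = buildRequest("ollama4j", "ollama4j-web-ui");
        System.out.println(request.uri());

        // Uncomment to actually hit the endpoint (100 requests/day without a key):
        // HttpResponse<String> response = HttpClient.newHttpClient()
        //         .send(request, HttpResponse.BodyHandlers.ofString());
        // System.out.println(response.body());
    }
}
```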
Higher-rated alternatives
ollama/ollama
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
jd-opensource/JDOxyGent4J
JDOxyGent4J: The Java sibling of the OxyGent ecosystem.
sammcj/gollama
Go manage your Ollama models
dext7r/ollama-api-pool
Intelligent Ollama API proxy pool based on Cloudflare Workers
nandlabs/golly
golly is an open source library for Go